[Binary tar archive — payload is gzip-compressed and not human-readable; only the member listing recovered from the tar headers is kept below.]

var/home/core/zuul-output/                      (directory, mode 0755, owner core:core)
var/home/core/zuul-output/logs/                 (directory, mode 0755, owner core:core)
var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log, mode 0644, owner core:core; binary contents omitted)
\pwY@_~%I^ڒ |^xf~m+㶗yܛ6{3kOCZ41y|{E(0KMx9Gƽ ȺwZ5910ǵTˢ#?fi6%rdn*U0'kuxeBqaxq Ux˫S K{vVҰv@dݶ2=x8Zܚ jMr;:%{}ϒBgt|**Wn>>P4NOag9ߛ>3m.b34O~l#et0sTx&zǾ凞Nq0JKu&:Xv66}y I7|[qR7 k.Zż^U"7!\ Lĕxj5H8{@К7S<oJ`z|~At+Y 9uu RLL0]i3Upm26c͘VR)!uk:2\vE]ejuuw2J@U&x̭+$Vu72Kq꺺{.=)wD&0/W~]uuB\G]F1cxUW.=ՠ5tH]!ΨL.讨L2VuhZL&QW\uŴ& n]]e*owUE7Ds!u 63 5132]KuU:됺Bٹ+U&WӮ+Sr*SLUWP] MA[^z"|w/9Om:er^uJqrr~sN,$XAuS} atX!.WJiS(rǸN'|5MDkQ^j%;$KJ$7'M!B->"}=21~}D;XLH XJ-N b9wK˿|m0D2DareML":GmrB H("q=aʿS@\4k*qނkfOe^:yJO%I8'y)UM[\P L!pOF/MJ;ɩ%^D!&X@ /DEc7#)|8jy]^?}n>+?Kk靣?ahB%ZFOp<s3{F C3=K4F4A =$\8 Mi̊hXQkeGzx)´y 2x.K>[ƶtL$զ\EY!.a?SŻ0gf8ผ?iOwwI94ٺ`4h{ʴ!lˬ2k/+ޘ[{!,C-LjG{mt֭>m}>(K% x.C$Wθ 3ܺ0S = ;1K[a?>[ӎȳKsEHigy_)6Bj8 Z9̩ 6LV1X`(A +H4; D锨ăCx ,<,l7` +iߓ.з$aA ~rV -{2e Fp}4;.L" 5Eؐ\ɦMUFZ|{Uzk~ԱͶ"+!-"gCZxed'$sظ}8j? VP‚G`Ŵku! :S! kQXfISVI$(9V0:GY0Am8Uh-2΅H©I@e$gSG'Thm1rv]B'֭]]|4pR5oGy4?+v" p 6o\ h&.}`q @VX4Gd`TFi5. u6@..š9;8!#ۼ\לMpZGk B^Z#ӈW1WvM2GÌ7(ȈVrEEP jfEC $WϨlK{qCx.ԝ$Wњ}*r"[ϜlGׁ/][g!'%'WVDu@}Fm'AQq*<,\vV<\f!ZCp@ Thx px&xER8lW&g7zF@X ?[bRxa否(27^+yvn] Q)ZFG+q8IGh"EǙm$%LhDuUptmpib%QPx4ֱ$a'yƍ3D8ޖhPTsr~iދeKc1C텽Mxyb&Cr%#e*a]zce(WJ;JP1ho35Z.\kw v8+(vـo}s5Fudv=RK J%k}wX_Aq\3;栂a)"Y03DjZW\o_s"K"ù[bڻ" kKtˏy~}5F90s]|)rު ΛSx.,U6F)Ө551sDФF%Sȕ{6*yr[gtCR.Bo'Px11݉ z_M RkG"'&i{l\8Ewih2c( I{-#X62DHG4ςO!Z橏PnC<5PDpJK"g9oQv =~?*R/'+dSNj!В]6u0}lݬJhy' !ON6[Oʆp-TT+>8~Q(J!$H UOkf 6rM J֡PI֌٭bq.du!pAQ~|r59= jBoc?u<ޕ6r+"Hۼݗl{@y7ZToގN:8<;$N '[$8cNHce"d$M"Xa8>T]ƹPm:{Yu> B_vή'r̓݁j4ji {FbPxN8'QjV( 1%4 B~ev)(s3+1F!=V1 3*Ɓ7S\@hÎةn{4;ћt۫m:{]-Y ʏ\Ӏow`9JD1b -$-ZZ{/C-3Cd)&Q3k {:A Q)Dqp[hOL^߃ydmϤ 7얮]xKLFjgD]16I~~k tnUj]ԭ;tͯkl.omn7MRύp$m{k񷾿vw ge|I*/LʋWZ]\,>bT7PVW`~9U"W\1Uh< #8MV^+_r]%j);wq۱@Jyi֝j{z;sA>>5LT|V3@h"Z8{96#zUSHa,:uqk6(w܁iߕ7w_p. 
&cx5+˧K|+1|jBZ^j\SW?W<E,!]Xnu OeAjyX9[d%ZX@ F/pJÃ&W(%Etpş^ehBrأp싵)IQA)3ZX\&zG`SAV6:]ũ4?_~jU&BE:K ar j2~F04K4gP𻅌o*j{fȧ&*,>,wWߛIJkf=K趚*aL$a=9_,Lf9pФx0(52K D za)E,Z*ҲFZ>{^R{<6 ZDUTb8RJ(IiF}N1Q`NJ9e^31F㌼qWєUzy8S;7PQa9se'D ^L!B(*04>ȓ|!I̝+-gW3'2n2nr us #ʵ3.egd@ffiY(Q 7E.VpsCvv!F]&ir$;,b:mknʖGcױO01%pݑIh+- `-W#eeAM#ddn6|gn\ʚMli{sC];ݪJsu"LrPvۛ.ʅiQeq8x;:8lo0Y c`͇r֊i)0qZFk)A@(GgtFΎ 3>?3z=$Jf沰}[կP2=u{fY^hauٞoœ3e(E >!1 PaQPg<-!cth @%:|Qc 6:@}MZב 4:&(fEwZI% "B'H&ٝqb(`*KGn9 X'N ]? V J %ÚQ'=S/Jk Br X$ eni 'tYAABD ƣޓ`5wG,I 45>[ t+!x/#?q>k$gU*{ cUvL^(t6|ڝXU.Zw;yOGto^,X&L$ezf{:>K0hu?.v;1֢>6M>񨺮qW!Vhm-Hh֣}|}E$tJPa}C һMlIqQ@ظC8Mc局JŪ7Sľ,H(ӹ lt5.[)(=ڔexr {uy@W0 8@+H A*&#̬ w6keV0_z~=ioM%դ}n?ǧOYf:(cЪSI-rQo8f;Øv>i8. L&:ӘG"h֩Q4wFm6*GT KI *:d:FpglGJ=8q+[,6EmJ+ey"R1T疥7Ղ$Aٿcb#H5X Dva8$z#z<{S!O'(9;2ڍ"Q4N9b5TcB)7jUQ( Ȣ`;X+\$JKPDc׹:#g x!߉ w"ec-}g'՛O`J\|dD*֌w?]7C+dMK ¹` 8ct2tj/9vR0|%gS3<++Н59?deNAn!r\B퀭1;Ĭ+Q  jRz\T`MrXeK.ErUL_\ޅ>ǁ ( |z)K{W) \z/Xr~[zm~G_k"")O:n06bS[腒 {[,P]\b Pt'k;:vǎs۳g;:DR)|L=yx\ Z`FgjTǁTN@q0HŪ3é6Q3KԀl( o.G{2M I( $hgeTL#1Տ62Nt8+gMASc崋PR!DD АR(%~qJ(hG/eWCA( oZa0),-(%/ %,")99Q(q;EAsH(軽tP>+r*Ht|&SRHhأ##ʢyiiEb@~mPXGt M'݃c#E a5,TgYM!4#؄!bP1YҾZA6e2@ie * $J0.iUȐvuFvp:q|sZEO>5ucV;O6,Z\5~mL3QK9{@+!{k3:&,Le,Kׂ$AC̽v"s4ѴFÛ~ɖ^bWGmoǾZPQ3`mY`v6UÛOi H٠-lPJyĽa@|5JuQc}*VVOJ'w `^cg16ۖxApR) Ce:99^JBWi?/[tZ'mI*[(H _0zTD1)TeT|0kq%Tu̶ҲPLA%"ˍShD6>늜"JnbPw3>٫*zyS* ]i;Nv``+H]̥֓O,6y*bFzd[']6T+9EMm¢U%LhX|T9PRSj Jk95((3WuuAuJ]u!o>ܱ$̓EY& 7WJRX6䤬 .J09NhF]!+cTZUKg}@_xo *za◇wEvoFyɸ<w쫵uZ[ZG]b )b 6 l&M@&"2cˈɢr!t|.^z[1Rl"Zbف*Ub ,Y4aJl6)p:8 B6KZ/i"}:VEmJ?+)XIyoqUPhffgŗ/o4Plj5N*QO8\}k)GI#c"(%z:"9Vwylm+}1e^]hRI&KTΐR`hXL B1i8'>588}_!I,ً ;;#gLhBJUjxkm+ 'V?ގk7Mg>Tx<|aYVBrC Z icv]_"!(`m  GRكKb1$/:?hnRfeXI9KL%Cڻ(!@A&ytR: ,dPm 23%h ;OHX[!-?Qf]* UH^3\,LYfG֙,IEճQuzni@Q;钑 RdŸMl2XbE!頋 hzAV u^$%K)Z-HDRBPR9Eo1E:EJbʑ2fCwvz}cQޡ0_Y|ИX{1QU@ې~釛-Xj؛+R݉.$mU}}BIB[qqfm.dBqb2=v/2VZ`T@50vH+`z~oU(i4mW@_g7 ^*sZܶh'_[揷odpRkVsn:աjoJN4 3 \ʩoeÞ]˜~=PC k&# J;3;@T@&+p|R<&> %0H@e:3&*l Dх'R{;Ձݴ_s.(r.sM\TZˋ[T6OY;W!_Oğ:v{E{B׫:0X:b },b]c 
?JC:(Tp^:X#Tdn??6uZ>k`||޽z|MH-mnCBOw]{ &Ob9ݹuíͬt&[:]NqͶCnpw~CϫR˧a|sD>{.<]wz 03FrÿoCtCyU6s;*|.Okls8ZL0*q6+ZKJ-|J}?``4Nޕ9z|Ȇf:p5&+1614=4W&j6׻{Wcj;&gz6>;kē1K~g98X|.Gy]ɢ1,$has32E-r㝁&!`+ܽ3}2ҳtZl?GO:$ A#JČ^x2Ȣ#9PSg+0 ٛ(]&}vMVHQ~|N0G& ܮZY׫`Lw-MK?[_FG7Gn/TCKܓSk;.j-ԖKJ'X}%c 603L@{'(xVٸudf位 V,qd}uHxe,-{I!n?aLn_Vjo-oiCC[4 +rS Vd~j7Vw75zśي |P˜k[1,ll4Y~۫6#=F<9b.\m@cZ6i2+f ƣSc:,f'YY?`e٦pI=X0BҢȬуբi+5b6ɠkLb{^3?`߿sc~C?Cj=*l& eaKtL78}0ZelFa2tJƣYʐ1GU I"$OaMEȂ6eq!ƬR֪PP6}SͦiQix|c hrGEdfy͕>;ۚBlFir1 QZ@!.أԢ^s9\_ߍRmRI9*m]1BJ Z1'itd$eȁ7$!{If-2KHְ0C-dnfYLHC\[Jm^6 %u$lh g}f%Q0d3Q !cnYm⴫QAi&; JXxX]W'f2:RЕ ,ٚ$x@*=[BV$lujxaO}iz; l YYThWyYXY\YK'KyoX.~wY*I'(*?{׶H\O6e^ 00yG#iEMII"%uJ"[LŪ'[+DC))IVDf*HcG4j,I:ŢKSch5ST:8S׾""7%}^$\ @ БBvP=5 }m.5ˎaJ$z-/VQq S4x&xؽ*؀d@> `-iЮltG*C dr(j]v H2!Znkk#ѕ:Iz;C.2T(ʈ>bM=A%xB>hAjx1P _@1NmrYwk)f*HcJɡUf}J`@;뒄 X ZSti ja3^\0##Y ljr(bRs>jE#Wnӥ!0r(:ff5ФUFCtO]XztQՒ-0 Uyۈ'3vEC@L ##B0@5Kh!۵jCa,z=ȉB&l{ʍFֺ[pW@:~%wXUw]Y_ͦܪ~+?^cWS9q5̵hհ˃Wz(FPWnZXQqѴ6P@=Ț>_ܧjAWlSO8 ĩ+&^VWY:^[:ujxC+Re5UWcRMXMmޥ5`GD!} ^o̊)g=^kخIw~r%Z3^ۦEo$E,f7Y͢o}E,f7Y͢o}E,f7Y͢o}E,f7Y͢o}E,f7Y͢o}E,f7Y͢+#pL@ G sM³#|0o6LԮ+`7MńŽbDa@2Ƴzu_{}&,~{x{{n^ٞۛŖ5o"XYs͚k\5׬f5kYs͚k\5׬f5kYs͚k\5׬f5kYs͚k\5׬f5kYs͚k\5׬f5kYs͚k\5׬~kThi#p\NJ]sMV*h]j%|1Ǔlf1Sv93yغzLnR!hV Ml"Clj|=s/to_/kҳw<w\LA&ժG]hKo):q8@U5قZy2*76瞶z~"(zcɉ/o?y3>YT<{K;o|Z3/vo|1Co G=^2}Yydz.ۋ Hռ4Y"hAE,f4Y"hAE,f4Y"hAE,f4Y"hAE,f4Y"hAE,f4Y"hAUp1WֳrS'l.Pz=jjk#֫"ܲ/,2uOgطqmSJ-柔%GUϊ[ꭥ%Z$4SrzIa)U$x\k!m)':vWK1UT?I~$ͥVG|bow^HGvwvDž7v}u?[v}>XFSK1فOفQ|-A{p_&rL,Fei")hNd4erHDLrgqh˝gQi3{ga2-YՕGlI´*QVIע0 R։Lŋ Iz(#y #f~F Bx[Cvt˕V$g%>M[~yR8ߩ_ogoqd2Fx2r+níxV<܊[p+níxV<܊[p+níxV<܊[p+níxV<܊[p+níxV<܊[p+níxV<åK?팖n7ׯ?]ZպK)wHdidVy%G s<Y75eݰZuGY/&S>ntr?Uw)&Ӓ$0-X֪&_6GUBnHJJHYuVn{baWu#'"B}er_LtHI9[λIΫr v*Q\ H]2cK3i>䜒r 59A:1fv&jdqgCOjzԏLZҺBQ_'Q3-EɆV;t7*sVZQjfWtOؕDiQ2d>IٟY ys5gmF#RUΧ +@4# &SM>ABXjB)z0|SRccM}oͼHHH{H;{Y¾\+}2EGMYz1\5+X+EtXٴ-^|F.F9rA8)+&2u8^^(Ƀ{ֹP8ݶ@tz%el]TߖaCz3d˫@ˇ:@o/pvFtF::j|>'yzE<>n}@_N>|p~Crez豐˫?rxf#qpdܼ )sff-KO8|@`tg[}l"|;/:27׈%o5GL\}|x堸xw ӌK?z(;TJ:6@'9ڎq_LK,wHY8gmqڸaL$QJZ-?w-m@xÎY ? 
{Erq㰳Şs=..OCl{\&]Q{Ϣ]V,VnV #?/v龵t q:i ٻFU++&9 _.SX'VmEyE̼eBP zIՍ=:"SVA"B E錳>|E="ܩRFn`TB YL#AA4u3L2`J"P-kQ`Z)mȣUx/'i#3Go 1_ 9k|DSsj;:78L,u6+\^R~6WȧolG; ]wG i͘ ͸p[TR*8OQ6Z}Jp+1If>xݣ].&Z~{i8Rla~[N s P7@ :vCƵ6*b"AdTS9oˬ}Pr06N2Zx ;c!#Ju͂NgӸɵ.^ҿk ^ו{1޷NyucX%f5g/N^Ԗ#95:vQx" a M/5Q+^uU0L t*R%"Ē+LL/*Hg9PBKk py(TB8K{E锨G#,Jrk)8@ĵz] ߋiY|Z.Hٶyoi&Ju kWDBx= H9UYSBw4Իxu@oWZڒ- ߙm^ٶHDHe{ 4^*)%(v#2ʃq0))k!h˾Q^!!$fKN,1S$3dpD@J0ɨKDB4l $8񢑫G9'^m~J'puGbß5FzG5N~i{To҉*1cP /r8$GZU'xjѩZzB+KhӒ T0γa>$ƭj[i2q88lAQnXӊ83&o}/woiF, ol"PH"FCGh:2l-% *AA_[-¼ zJ ijaM!o `}s"F(gء.eZ,Фt퇦 (FapY!(& 8V' @ eC-@r!hS"u;]>Оn] i1'6o2Cٶ^}`CR 1N mL 1B)oma "2?KXGьKn&&>~+, U( ^ulA?MRtr.բEosDqS6T7MI{{XT*ow f *-ИD @@ڀ|$Tlhy Ne)=qȊ1qP@0H5YtMʥ *8~RBd,q3J9,,FBVXVUVmk|ﴊiGOЄɀz^w4%v&9)7 0SZD"LH(@ 'dtx͉\qSI6dEF,6A D̢<@s JTORbl~ЏsHbԱ慥6ov`7Q?$vCh!#2A@bLPQF(#ɐ salQ?=M͂\$b1W"Q%5hf,|P`B@# ͝"B5%k޵崖Ξ5)fEbW5zQ[=gk&ǸuDm ىdt$=4~l.m7?FϮ!QICuWY ּ5.TO5?Ԥiob3LKЯnOK<*% i!ĩ\Bl`*Qfs3D( -K5`O7*LZNmx۽DHs9M^I+rex{~92̜,k図䛣% A?&r$1oqXNɈtkJfi֬VM b˗o5n9sg2sɲn9t _]Zxcxv <U{P;Yn[ԫTjibm yr Qƭ;')o%*m"o(揸7Q'bͳh˜cSv?7,Zg6&nófQ{Kl2wcf2FfsOh/x5픅Js:/:i׾k~٤lf\۬u|5(jԞ8y 9N F<3BHN`Tx()D1q$%#3.qm#:C[H]L!5)3S6Υ ݿkAtc,{2 ?QVZY}-ooo"SA c~9/ f_zߏ(r<㇟_nE/' O-X/2Ϲ[Qo?A5ȫJjC`|QL+ABZ)/,gW5xa  P B%![šŨc'dg4H=DTˆF=?Uɥk(љL~i~EMv&)Oð?+?ؙCPupRsgwE4xΖ?\vOf8)4UYrjB̡\(EeH*TVIYyM̠ͨ+i9of1oW;G]uR:=V7~>;Z; m`=D05',7^ 'ϴ)CcTHB6DAS\SD.4n 3KP(-ǠrvLe4T;D^tCw~ k9=}Wdj]>xd]Z;X=MK(qNd>asQ( .,$8m=c!hd !$/uP(?^>V <J" N$c hJ5r8dP\fBmٝ^ˏ5A!^C&hwsaRx.Yomaq-'aC;.f O_F|A/Uu?柽ݙ<>O;|&.;F}x|/f|\(E-<ՎBgYmT<!5xhR,=OHA}\JSI#^zƿ-Ȭd6Ľ4t>^kw wRsL]̓ݻx7ay,WZ͜Oo$~_P\PU> ]ܓk8JpF#󽧋ulܛo+8(AT{D8 ^.%X>k7ּ2SB\>u~أjb[Afw\Gԛq<}\]m$qo kQ 55Q6mфVe>x;҉h^O(|j{5G-=|ơ{>F Cuzu˴" C**S Y׼v3d^d$ڋt*_ >IGaJKGhrKh "!EiZ;́TW5|9GXNi@ZDGKtn>R _nbQZ6Du|??"^Wjy5onzUX>Y:I=C5NK#^%Nt2 aܝZ/, :+JJ9o5eD6gRFt;V}ֲ_Qʍ Ixa[HVx5ըf$cZfu`ʹEyU1tΩ{6mz~ߨ-S7Zx˕^ƥM?U.y_ft!e9z֛GWw3^/҉σK6<7r3=  MnOo~ߩ{6o8:C49G03Ҝ>tAf '~RߧHmls ) Fv|K /T:y_9c\%+ѹenmvom35-Rtd[gB+A8#YbQtQ 4G+强`#7֬9hoGvhƇX.w#L۵nZrOAAY5^b1jTڠ( wVTY392g{8W蟋].-d/5E2$e[Y_<LjZ4"[fwuwUjlrD`p 
ۀPE.9%x%zr.@gd嫷7YmҔ\oW3Y)Hk|2-\Hz-T-^a~ 0i8Ž_]%99uTUVcWWJzubԕh)*$~vu$%FϬF.yfu4j~u4*'Q+u%zuc%'`ɨD.姢U{uzj}hߩ"#"OF]%r=uUR5+J)lSxHqV4|<=7*^w &`\zK.i ayΠFr}n462+_ݿ^sA<FJ@1VygklXvܰ?Ucƣx0(BݷmWYpc`bҹVpb#VoiiVKXtWMlCMh?6GYE,.~;ѿ̋Klz,B&yb U@j״w?d%QGÞdقx&(hsrxhYpuptf$ew'Km؈HVbU gB)8f7_ cyNcg_ ~{ejvMQٶIw@:dClTt^v:GbKSw*}!JmJT ^-JV]GOanJM ׂC6s\X@ld![<ltEN Sn|N'ppw4U'N'ހ/{^=QRabEE37ehݴE+ELie]Y>[ x BÊCٓ?[duЖ'aWe*J' _NzxA{rVnbuy$k .ܱ SM1JSBpȔX9DKd;lq1fC j)"{"ҦDDq5Y)"RbmEd%]-"\eE`w]!s@BW ƜӮ\OYnqxGU S|JӉ+Ojt*O@->#XO0i:!u>ts*+䩨DbǮ+.(bo;mff|w; :ZQ8 ,LL{~McV|j Kltڼf V%׼JcQq e-u=B{Ƞϫw}-'u%?;c)"=c5k6mܝMOȢ99&ԩX4CyߢIT*[4Тӆ}I\qĔ.?ȾۏMlk5UEΒhЦ`+inaĐBa೬|ǁ] 5Hy.8 EE(Ic0v]k02k2xbq<,vu͌/ %og)t-<snߞ\"{w؄gߙI커{E&:E?8jT_EVGguy^J! 5drF 3Ͽ27rW}"JdvTj44x85~5nWW {,e.чK7]Kt~6~}՚0]x)E޻VQ=~_+qDާ'ǽڥj<5 vM麪MynsRɇAb{eT>uݦersr%T8 yC)RReŜ(aHr^@s`3aEUnuohA?S*+Vl>·/_DrY^2WHf/`>`.Yֽg^(P;zx5,IG6tv v;uDvtvLWw}Vr_V-{MJnګ;M q$%4Í^MR&4 3.>0 |Ž8i;4ͶfZ3BIJ!v0`aV{3h oclfEx q5`œy[8]}튾8|);PN E7ގ'F_\`44[҈mY}yqA7k#v6IN§HpƜo#ŭ'D*3aé{(Ǫxv .5淓zvyJ22u/u4t:LyGc70 oƌhߓUyǰuU ߪFOT&o_~Yv Kzއ^?#3,/:dcWug)CVL鼾Ugh>ߝ/]J}t^.. y?|%RGxxA83GQvmfY<0]g Yqdսa}S#3'!ƎL6-tj_|0xk^^<7>`ʄ,ܰEr-8ϭĄxz}%#} D1U\ #?8^C2O5QvQD#Rx)OiiTYwQ-< "FوD (Wm%6!C2 ǨUڥ"0,<נR5c9-c|'_]F<[aB[DpʗIxǵyOb >=u`X-z4p.]adNJÙ2"̉vSӈSr e8pіFfҡ-2o:|mʢf2mxZ;"E8IE$s2JaQiƝVc  B' Ll7cPM#6] X+PaϨ^yMj5 ! G,Y?\KMmYłK1Ŕ{.`mGG [A)59+(6A( {Oq$(nNyKGj}s$gE&3wSV}nY,yl1 Fu>Jnd>.3ٟCO,YPRR&mfF>|"IL環-J9󲟔7퓾c<:X2/f1-Yɸ-X~4"IݬK=1ZhoC_$N.պS%0 bw}HM oRDs;O+^b;܉Vbα*"t~%mln|63< u^=V AN[`*H KVgVs_}5 eXxR9ql{TҦ_$B/-EYf>5zm?N)9]C`K9`^68y]b/2i׿"Lopp'P̝`u"i[%<产g'WOVHHA`ȧRA$%`WDeS9ʙ@>B|~?F! 20鍐4 otHzoZxi穔 NGO  C=`ǫZ\YսyqYք,M:<~܇~2t=ȿBoZnm<9msWkჿXpÙw`;ᜌF)3Y4)I~]ғhQQINEI$8<OYp̢;fcZ?39OΥw@)$0ephͮZ08. 
L&:ڭịUh4BIA͝QFGRlB2Kpgl8q2ųrd9)JSr][eYrI %MKu2eIJYRVa~eIeI_aYRIhTGam^gmVɩ:ݰϴ˲Q/9T#JRXӋ`>G>ec0hts~q1{ROw&TLǟ/'}S5xd$p ^ f5\0tȁ{գϼHh_ :Q׵cd '2!ann+ZFoOP<]_?J_\>85ToMzYd6{Ԉ΃ :o#L3p$1Ukɪ'e(XIv`w4hf$0QFtE8ŀDsB'骅)2˄ǖV!MnֳدG?8mo>/}*%ƥy4cnv\P$; 20F}ǭ1߫|S6]z;Hyk;F Z B#h.@[9;E3m{@_ĀD:^E Ciʢa#]eUN#z:\Vif}Q(GUl.p=:iqeZ7vP.<}#T GH׮5HvA ߕL>Sm/c#Y0*qDIf($##h}Yg&B̥NF!]n@:&3K<kqBPRhόHHh\"{>ް&frD8>֤haE5ᇮqLPL10i8?{\i&uz2 Dԁ:YIxQ$8Τ%@XV19{NhzixɆrpX쒗mͱrFH`Χ^s)WpZ*.>͔VAad6 :ʹygu]"Ј>Lk)j{j?C@|o|Wëm60eif,U\0,&R8Xt&e[ƹQ@[0dETJxЋ#B 1B*i-pT 0kGgq2P?nnJ|o$#֞ FWw-9mZ0D̐P2 W6ɂr5ie$ AHQ`Sj5 Gx@. pNgHjgg3b0=p.OiǶ-+l:Gg36EXpYLe!$NP !J&OҦnP[Vl41CEG&!ZQq$ Q%V(_d5q6a/< 0 "Vӏm*#"4DlN#G %CL2d^]@-y62u9_ ׍t3F|cT y1=N" )D Qs#H#KCʈXMgD ./,uJ{մd[\ԕqQ5\lxҐ=x=]P|LGݍKgbr/gm7Gyh#E#KiZ,_x>^xbVN1V3£JbǵIdAuRgg}uۿ"zp39܌0\%nK\#"tAHG|Ȭ#BI\cr>G۪mz} ~Ʋ.}ɪEӖCw:3G2= بYyL ipz4)tVD:_ {Ah@_tZ 8ȶa-aſ[|C6bf~;7ϡ݃\/?y =pOG ߗ$'&vwgO.xT*qczރXd1ڤzauJio5^Ӹd!tY@9 G.02m^a 'z뉾'#=kMп*>ά yGr43nU :&.Tv5WYwKOͧ!{ ߨޑc-NOɉX^eI)CGKO+U}NQF+zHw:{Wq\mCj?=\l:rV7. t~@CJFyD( ]At,ߠ# ځb=1kCjU;~PȽMd+0҇pABEH.YO]0-CJc:)]!x7!ͼnN_xBJQԠEQ#Q3>Rh2^!@` h}4( 䢱֖mY̴Ka&A(5:1 4 aQKѰukm<#}8kH ]GLoZ}|'ݼA)QxD6.cGKrMO IY]w3a: 5rtlr< ƔQ,O2KF1& AG:3bvQ-o}'LNv֤i?T%fKq"@Z i|H1oq*O?aa6/ՠ K^xCG'f+:W?$rn ~w 8ݤTsJ)E/f$+r0v/Kܹz3/^ܣStǣe SS:t٥Z},lO5:wvF+A sˋ7C_]^A-^̂;p2ߗr1[ngE.J F}y˟.H/>nwGpy/oG03+}j5IfC #;wVg*j.ua\ګG)?jG[ߵ6}uyNnJL* KkCL )@.1Od.s*~5A%7Y7>9"w5662ikC\fƒDaY(Ĭ5yƢ3)Ȭsu'Ӂk2"l%}V|JE!!*cݙ8Mߎq#ScVMOc+1cYcNzCrV +}[utPlK<,U2nv%d)K.z0bdk.1\`\KL U׳`a5 Ee, K =֊WJ+=&'q.`08:|#v46{9Ad( 3d&̂@ Fnh`2!EMM4$#{̺,!8ITF#SK͈F<Ԯۢڲvc4̀B:btqTB2Rqa*:nP[V,!r /:J45 Y#adt8wA+ Ƕ"6D\֑#E&@2|/Y ˖D|:/us^{F ŸNil2. 
O~u ZUd=n_:mZ7-tբ\ Dчa Zq?#H B Mm`ӈԮ6jQ}?jT?Đsw .?}t۷{̾:X4'Oߥ!I0I>^d %!ę3N mb\ȿG B}Wa}۬o_]\TY]'v~?y8/ӏOϹ򬛅#V3PtyyaZ]j,;e)$7-iU, ]Xb󋾾& N m7{nuԓ~|Z&B+RwK3:9@Fё̗H4_N:N '|FPՇ=f/ܳHлE ϊ _Y7 ݌I3KTd&vYǁ[|ԥJ"D v׶aY }̶ݰI{?㭴qRao/'<FDL+R**$a}F"JV\Jq17ۛ) TK7N(߾^xEtXu1<{uu ?Rt /vZ;7259塃cR]0\dzMʝr 1$23}]0pLs,z:;d3-"e+ u.+tA&!>dIT )(e( b_ކ`:D,q[="aK M`]- uB8cvz5q|'_wyY$-48j89tOtgԲ,ϩaW`+aH]h޵q$2?"fe}~ʊ)P*AU )ɑ4Hk؂8͚TWшSr"ׂem qj&ڡsWd>(J_ ;6Ĝ)0itL*"sNaQiƝVc  (B' Lc(0u߈/XGYVV[ X+0aͨ^yM쩗j5 ! G,YĿ^?Ԫ&U,X*t´rOo# >Xm"kQC#1ԭ"= hppBLjx@ DMxǽ'j8X@hk7":含KɁR 'X}ଋqо(8]7Qdz:ͯi}qY[L$eŒ})\O~%R!.k:/=>`GZ$Ϧ\'IQ-e?Q$Zch.F$A'H5Pa^L~ϝB ֻ*65S P55.|=,ޏd"wU*3Et}EX[lV C DS48rSUҎA鲹W0 5:m!^AH8RHgpgͺ% y^,NGaI|&<"Oڛ_1nJbT#Mk~4t]拕ȅPV UvىHivi L,z{z1U7`ΔIJMpw/Kkr[֞C죋~)I!p-2uaD`ȧR;HHb <(g.z8;!: l@L3ɼ FzeA?<2|= zP ģR- KzjXTIܬ_XؑN7.ˆԁNw_%ϔ:ԁYuvp3./'^auZ{58vJشY7+pZ/&Bpޏ/~>,j Y߯vrJ,/YgK*~p\u#ny?9۽6ݧ DNC6 -On4ʎg`tM&YaI^Ox)"5VYp[rs}ּLdEP 2탟R_ɕt<;=#˯ID(Z2K-$-h }ھ}̌cxy\ԭYT:Dͬ *%༲&DZke8pmQ#m΃: xH%vX4qfAمۓ [Y$M8sT56 ~q<.hw}l9snWn;owQHcAyevyM&[-.:} oz6w&\ֹ=WYy-񖢹xж 7Ek[Ehqsj`ɑ[]?t]C =~=@#}:>o6Q@'N8'QʔjV* bJh@2=tC'=trЉo F6* b$A:f'PXrQ*,a+TT)pit8x~#Li~GdžaD hXpFkv)@cY?P7`c>}81Ù]},0p9}8qQ}P%9"s tmyDHr9l8uVۛyfv%&!uL6av;.=Len۪7)2swSr۪ɽUs}-m_pA} %Ya%•Zp^Z ),T%!ׇHuܧ2}nTa!{ֆ(rVG1JHrgÎhXL "hHt#a |[M`&a*5 nK)5Ga8z<w?wU}}ۺ'N.\*3_|J67u[:EKI/ rwIMAt) gP  .cb 㙠Q 8D? L:C%:|PU.ך7y 1wDbL*"Q* J3KЈ`@-o6] jhԬEfRR&2Uaƾs F9I#Z*#zyMgwGSUGZ$Ϧ\'I V,e?Q$)ګ-")D RM(ys),. h%TM _ԙh2k`*ݙ"v" NR 6 b!")KAZpiG׃j{ t\[V+` -,wXp?aYa) 20鍐4 otHzoZxS)㩝ЙL<*Aʰ)ꁷ7.ˆԁNw_%ϔ:@ҤW#U$V@^bhZHZ #a^._Ҽoi~-Cd)puYUK,ayBT &[׆1ۆA/&x$ ) X1cE=(FMFSѱ;-c6 uA6Dy-3K]ǧ׻֙b6#unxN#L6t '=3۽S[YOS'ʩIT56~qqN..lwpol9snWn;owqRxAyevyH]7[\}2?vM];7MoSo*xж\DOu=W">6؜J4/<O~&iՏ?|ææ|kß'o=,mqmqڶɿ qʷÕ{7Eb oQN8'QʔjVŽhKL Hr̄aa%(xk#j FRcFq 57૱2E+vNa5xfvvKa{M!K{lv}3trLM6u~#l.ٴM!n3NPg1H)PMVG;>8. L&:*a]-Xh0P0%AC6`Q1,%!6T!ӱwn 8!qS.WnY)jus\у2FlӿtKlӋ HF= ͌!%wޕ$ٿB0%gF gw0~6yDJ\SY`!?IvwV04Rn缱:Qs"\-q$jRCE `$Q #CQ"%? 
d\Zb -' ~dM>M=~nO]efn}3` |jSzjZ ]u|UJ}=5ƳԦYJRM(~+8\2 ,5(Dk+t lO8/(;,=RI;=W/S<%4%{a-(Jg%l)ʘNNn'U7嵬paJ:Y8)ҢJv 6`opWo=7@dҁ^o_S:^:<{nHp)c*RQa~փo)!= Us!PKEm&XF[Kr)OdSZ-Rt8SmQmJdQ*\RU7%\TpD(!zj`].HߗۿuكKU<$ܢә*YzRj/vD1itWmr9-ҳ~Y:= .@n~פ7: '4Uz)JFOFqwFRsQzR*W(en&MRIG=$νҶ<EùW})&EN, ^E$&%(1W ]etU~G i;p bq[8ujYev^C7j{n(Q?w;pz'޹8vDH*cU[.x4RyVs<^FUi*+k'?m*rD4rE*7:ŵtܩm9SFѐю$@01A(u>@֙A骶72 3K/[&{OF )` _^ `.H2cyaYV`֚8Lʶ4"9`ݨI6/Y|:pMJ]D/%yDHo %\1Q[Ezn&~T,CmŢbOUIS>6T4Uko{R;ݩ KU//9;Jp7 Kɖʻtz"`\$=aVBYV\Z]dau㥴59+ISz0r5 yL6lLkA6>HFjٍYId 2BGňʳM`#mRo8X{z'4~0{_?;ŜdΤAH  R xB&xe,(+7,VdIP?@@MM2h 82uCt2z)0-դ㥨-*Pc$A u,bemrt҆8AHD\+0LYEnHV,!rIVthkFMđ0>*B g7F|״ b5x)"ʈ(;Dq[G HeH񽠫f@-Y 6Ru9/m^r^{ q3FlcTRzTMB.8RVF͍@#Y XIUFj#fu(9:IKqQWEbg Y/޺66m;Kzl{{ַGWw>-'Q}Z Р -tUMHHJ-.ggW!\٣M1$Q?R&1E0!{<D'M,+sPrʼ =j R(H%fmH3\'$i[&KUū gQ]Ω 1&O\"eo'O+t}s>dx9  I?02 $bĘ)S:EgbC '@Dl&[;$FjkzB8T҂!ahẌ*xn{@UwuL|>u܆<헙enmf7x$˥$,ym^dRH'+ X!m/VeL)F1kUx͓kVrUؠ KIwezhSѥ,0T%fUH`}F"J^1NܟMZq.eKWgU7,iEs 8էK:QjsYԮF6jwBjkd㔗:6N+ek9xOr1Fd-$]7?78Lmf%؜xynmH~|A: @r" *y"0 !p n:mȬn(C5#$"Xen+B#x!sr,ծRm8{a,l |=O2~lhi}Mͣw,|W6P^jc/..'B0ЪEfLZ}ZGFl#wIw%/:}q)7nЋdL40%rƂiD[.I5*`m"@^}X1b|`ʺWBLL3靉hMLỷ4WZ8p!@''aj +1MTHtVpoT::Tr) |GEOS"'qe1ܳE`LlRT)R@ρR{Ae7g\xX] uC۹3лoyw >~.'uLnF3g!*SwL~Z#ŝ8.d[_N`puϮVʏx߮ZSnpٱOK\KR] cXf=谱p9s?5}gg8oOߦ]VE|}}kY>>|*2gͤiV->)KzL\ʱ%e@ݛr{1MBkzsS2 &HePog&ROyܭ}ldD9 z.-hR)oa,5>[b() %cNvvݴ/#IB6@F~3``7 `AZӤBRv!iQEl=HLuwOuS$'N>11:NlLEV眔R4JRlBB*'PM'vrtP+ Gv*޹ꉻOeFׁϏ{ׁNtr({g̈́Wd7 IcrFuO$.=vTR?j)[>axԼd8)D2R6F*n#=$0 麵Vֽ=5Os0yҹo[]oxptm 1#pj> {P!}q z' ӥBmvm5,Rt3YI.TA)$]SHq^u RK߉4m۱هzhƇymryggpp-%3chj꠬>7F.mP\`.Qw-X.7ޕ q+kc (IVg>qMZHѤʁծ^M3Abu:kg;z1a3%ey$RpeIy}3+Shڸ!K48ɨ`U;<4,Oḅ]gnY} 3l E dC/ ,1ni}.EIq/4C倴qa|;Yz3#\ .on&:8FgrX,6 ӛG2y%^uu.J smsLQ!bf68㡵N}!AjM0Lɞ"^zrb G$?kPC^ c8 }4ĸY୍(bVzc_K Lu%,HKhu| 1-4 i4}e!|O_'҅`۷uCߍ/7њKEPl꽻pu.,)T]F8~XPՒ- j ̝eN,=:Z#?"蘎iAܯGMQڔT/J\79!YϼIsKobIdzNepuj΁gPKkR2D'R)T$霹,A˸Zŝ|=ڍgո/qexM9kweDyYN׈Y7+Oͻ.bޑeJ# \YGPp\!ohPu17tɥ}>/ޭ*[g*P޶Dok.=:ӲL Ʒ%Slh8D81 5K6> hd҂C϶a[w ۋ!gw\ID"G'c"WJa`ٌf9lj.e9Gg 
QЧ[&~$<~*#[,&6P;@}|6)~س##^"\ƎVŃ6&Qkc"rhה~KV=9:69J<cRRs "je"Z2& q |%d J,T$E,Xmр6j-@jt2#$d܅aL'ӊ>ߙޭ0zה6Njdw6,^e+9|憿9u`c:VbJESGy:4֖3aHy'ү2y 1hi[6e`\ ]r෽}s$4Zyš.v NjX f|t7FenIqްJkv蓋f{)fV8P'sQ8Xi݊tԳz'3˙|:2ș=]KGegI08G[Y>)+El M:fSR݃.$l.幢ImH2AK^~A!yAHa reZ3jtgMW|Xlܯ1vb>,z*rHb>[!޴PkNDpO/6|V24E)^xY$myMY&˂-hc:uPFTYp Idvf=D%s+AVD j#c5q6#,(XXM3vBcAprlͮ%.nYlzaNw~wCgzfdgR3Mc12fRd`<%&rx-BUJ'HP? A@MtPF =fl $*N]K͈ǣxǸ<ԮڢG>ɚ he$6+dF}"jRq%A3i$}JmϺ~Y*/)bB&fȑ & LPD GQFV(UҮvjl֨_[`<D""VFDG4X2Ē`$(5M|'ˆHMUTك+ Nu~3Fl$Q28M5\pG &"Ȓi䝩)8[8N4Ԟpq')uVӒ]qQUEb:Z"N0mLCI2 -p +R/(]Sjq*xv@ز4ǚ,]#71io]$Jzw#CJ@gTLMM".}&5zMVJbjb-\2)WG p4q8.\=MZˏWOQ\'jסp3"gWE\s6pE7WEJ=\}pbWE`%Z \i-;u")9k+ĮH`Uטs+VzBp*9#"5**<*Қ7=\}p%r;qԍ< nT(^RGuǣGGbq}]+eWo5FOMRV1\0N=N~!tϷkf?鹶Sbiz/?~j:1w ڻZiikd].Fyb1gC[~>E\8"-©[4EJ ZP-Eo?s嬒]{UFʿwu7U9XHчBh+MDX 1do ˫K=ؗa/oJo._Ԛbg=;~7jh"lqp9‘٬YE\ie*<5T?4*,_eRYM{_*4k 3\YWs(sDz|wݕHÞc33O[kx5mzX#3Vd0^BdZ&&evgV7U S>i@j]J}, q(3Z!dR1Gmy2SRO8_6HrZZ357ǝE'\.1&u8zfXV!|6AZ; MB>14̹JfTerzy0爒:}Єd˘,T U"V4fcQk4fDr4js]U(Rܫ4JmkO MVcO֭HXKR 9:ȤK9F5&yRZF\ЫQԿ>\KViG:HN$\ I2&1 ]VDGHu'Ne6;9XƄ #$oe2s DJIZפ|)Lr_J>rӗh[|n[:eZvGӒ#};anWSz;c Wc'nfnˎ:|ѥݩg v;ZRL(ޤJ6L*{e(}7\rn˦}݌ }vmq[{"X۝&Omo_ɂ.O.U $zyA oM7.o^~d-y[ Л_HI1-x V%N`py]Yo9+'m# 4anb AI㲤7%TbI%9 ؒ3Y,&3"}dUzf3'_ήyM& ?=c%ad &1 aip[a_Yӯ{*%]0Rڿm\O_VUrߢ+oɬ iߖ F`9LuO T_pzL ݏcƓ0>ޝD &KRB`m+'JrMau-ΓZIA(C%|u"̐K-/^ D N yr.(9Hu Yk 2ÔsMSgtX6%47U/o]!ِSu~`ukG;ǟ,kr{1ߦ~: i5<Rq=Lf31:ZN;`3ءu!h,dGWy;T!}Ʋ_7"x$H!FcR(e&qT7V" <)\bWryPm壡iTy6kvgM=,[Vݻn.v6LU0ns׵~Sueu uu7o{΋ٖNolq9g[vrQZdn7Mw=6dwN0gEwIC[:΁>]9+u{ ۬7xE]sWwm6Oď/3SKgn,8i?7`nNݻ>X3 {)cD#WjIZЍ}#w?RHuY˔\[r 7(P6de H %$Ҕ+P S>f Eg-R!I,l>"1wo64]M/;#P^]M?Z)weJdY{yzVr {VM,IH1S2dCIK'DZB)-bu+lJM >`$cIBA:aVD͆]WWǕuE7^~qs\)Tr=31K-IM*wɮ6rW,=8{ |jpmY:z\v1 A{/eG$dEGr> o3(]& Yڭ򌻡_ѝ4L&N&u '_% եo=9Y g/^Ň/kWo%@7-l %oPeo|5pkBC툴 i-ilޟ_]W uaB&H >wA.:Qh#zĺi?oN+J2]SRRI(JO"J1{QRT0[[6!i E1& H[XH2JާbtW9.VlںoiWK6>x7cɼ#fnj%4}+PڊvV/#y&c'_/NjLQ@(.0eÓ3$[b#1]4. 
GB#΀d##he+t"TC#@%f# lBwK8)TTv֧xju/g3N8Yl߃b>tFK2P_Ne:\|1qx 2NXeW'Ӝ=]Oq \IZeBm4uҺAa L- Kih;r}{t-vGAkV9* )2 N(ae T l(b.hhJY2&(I,M$=-2*mrj&$ ,2luiL g;-1aIAKuf\uQdgm1#n=l73esK fDQ3?JnP8\tC?*!$1C&R&hDA#zT W]WkgCoacAzôEy;#{&j}!ϡE="Lwgٯ>:].򣒕*ٯ?"Q)(Xj2"d Xfn(甌0eIUnj31ٯXI_cl#0:QtLk*\|M ɔ5`"SkU3$J6{6>5Vfٮt:~ـc(LN*J_옪>z/*ķUޞ|`a(IR [AMHPuuf0#,`22ZJ%FQwYsRT]X0(RI)[i5&jm gelUf-TmmAmR1O[=̾Ґxta229zW'< nzR(&*P2(#'rgm0>:-ѶqauJBVBǨfS[,D )d{Nx, mᡊtp[ gŎ<L:ں֣M8HY=X+ 1K6:;qkjvc]H׊+02,:irlk2d),GQ|!",*"Z'^o6'i:=bǾ[D-h7IG&by c{-Df36VTl%POJ$sې4/`ܘ hYUO*dSb&$:$p[/gį:]- O^g3).vьvq$^#h%HH]bVA 1NFہbMjN)1vvXa38{h0a scWUvLޏ-߿fѩf6L1- J@.]`F/+"SrP˰f/+o_>@&?fcwyx>minMj#XkzF,JU,D@R2x2u1B(S*iַ,"Ў?x=I{Gx̖k=k4*gK7N!PTo`J&FR7۝cRzy$J+2Xmz(jr:d#ˆW}fk߰BJ:Rl 84Z]FFmH ꢳof;&ǖ~hBBf8\y s~nsݧ{qXɇwhs`xx %,2E N%ohʑYxhSIh*v3tu4cZƱ|5ąGo^p{Dx}3v=WhЮa:΃j|/J\}3^?'UͼOo~+ Tx*G +8d5K-iJ_Ǎ, Ǎ܎g֋~dVhY^Dz/Kùz}ϛ@7"~B*P 嶷|oM#ַ/||3-X<ۚ@av?@d/},^<"b67 N_O;>Iawm吺ݭ{PkӓͿ^T˴3~{9!z_I($]4LMt#?>&v&}e񏉶lHlYIcvi,/`] v e^'#)'ŖN$;eIH)6Wugcs;+r$&Qk):]RveE6e'1_3no?{'1nfeT.i-'d%$^xNON~[n}ɮG~6_~0&m/OQ趓reElG>Ĩ{3dH~.}i;َw1oLeЮEZ#v#v2}!˔`T\⋼)(#y5^rrI j0ƤA7P,uhXKD?ѻoz^;X}!v?!:-uhwr0-RAghqqjL /x1 P, ʿe&[߃~RIE;=/QC1k.b"`C&X% JP5•GZ ؘ E.ZZ']muBRaݖpX}v^ D}Xx?jz&_'B4;|y6>˙ܤ.ѣyV^!9*TGnF*Ԛ] Rsll{ѹZKXLP*r.[Pi*{f*^l2P&OV<.,PzDcb+ۓZssH|z=Gk{"c^;پ$|)IMw9"3FCҬS6S֡. -vň 5L3-19G\>&vĪ*W[wz >?*Gp"MU}pNTGZOUϬuKb&B|2qwˏ̃njjq54Pa+yCZ h'2ceBsB ) {r6E}|[D^RY!Ry(g{vR\-!(hd N+?g {bClvu|ޑT:bFCLP]Ur™3rS.n=槖VBZ?Tb!J\lLEW/ƞ2Aסf>2)oݰAwWls5ca<_ğ bX:_ᷓ}BWN͇39|1IJ&q6^g<.ܮGQ!l>u\F.$W<6?g`^6!QC8VK  > } s -Ҋizr#b]igV+A66䘒Zd BԢcML$O:nydsBU6$孋 !)6jH$b[,J^U2UUP/5d"(2)FKډTlk0PI$5!vɖ v[Ê1Nh)/ZShє%'Raz|9u(ԹsLƒbs3e[a~y$vEGt; Q jKKV@,z2qR۝h h1Z(kߖ.ZJMI⤢ =[ܤ%(K-'Έd1Vm9#c? IS:c!LXxv|7V3\ݎm]^/j}8]~]<3#v`KSU5TIOphCF%k }9({2)09,H@l;#*KUr%9{-aNr<&OEm̈́넣PHe*QufҠlŐRxqg+|oEǜ!j+:5Uv(%DQD[TTc쌇ݖ0FEdcAnTDΈ"NO!Fq`h(~42kV55[`W#&]d̺o4 ٢MBm8JٱpY,i@oט[/yhw,: ..MMJ3. 
'\Gh@Ҧ(_0(Ug":$MQp:s/8ℋcnq,x螆^m;r+0z_kW9uF1F=>"n~| ^Q UYjZ"m{ նJS ՟*0ؐ0ry|գ2b(0 *;$qH-hlFE*(Ly0g-Ve@*٫Yľa=yD)̶{Ȟ> 2ph+5-dgWSE^gQ>}ϗv ăd4NA/q|Ey3²t؁ 9gTfe3$PqpZf 5M]h[i١{*s5݃X3"b Kj %6 }ѪXDA(FՕB2% VjtHc5œApU9ar-$VEyq<2cob_1=1T|º vW/΄minbvq{Rxg "&ZdT*hu60D"eYWA&1aB '5uFDg";c$[ vhD_34ՄRjJ ّa՜o&=6gCFfSZjNJ9TzDhuM B:zue7l]b ɤVeV]g'L!!i||_#.:S29}+*E9EFc5?W/셑kh4A0 \1Pʲ 1+)(vȵ$..ZIig1)>.ء=C#C-HQo"=@7,_M7B o' @ ۬6*o1 *g,.y7x7g8AM8#nC*rq~qqa *1Kw^fHgE sv6_>01ҭ\-.~ۨYf<&AQYx7ɘ8bk1 TFfzωcsqodi&~T~yO:l_:l5 s gS^X[#*k 'v#{?{Nν/(~9hc8ze/*g3j$Zl*b5>䨲 ^\χ%>^M^q,E3:£ǪW\㞡һst=kM{xqvú^wZ]88`uy g퐼(Ahw>ᲝKǜ0IV#t-y^:yWܙoѭO"D2%SO/lL5LH6u6&!inƮCogR66:2 bˍCeL1i %G'] һ6_<2 FΏ?qYଭ"ʝ_^~\]5 .Ij\?]s?x,w;fxC4Fi ^Ev}mRTiCvc'0bQhAB 4J"'%g=@%)G)iKt, fX hX^4(;cxtGoQDpyN1(\KS)噙K(FE(N1Z-ː`[-$)mlmerNZQ)Vc6Th&EHOrzRTZoN訲eA)(ٖfk`ْ;t76vaq4o9>Js?~{esg]̗>Yˋysv}v5Oӟ]\_.*ۓl2??FO=Kp}ſk :@>r2WwT @>(<}[ ©\eZ/W]dKkFn3ތ@͌Ŭ[XڼP:4V >׺;߷A W1ޜu?].6n/Eq_;Qie:m|} LJ{gi\.C].W|8 c۟ABDQԚ) )%[\+trZZ//KSJ'ۓ^(8 [{gM|͛"-`=bFK`Țz̀$9o<-Ry-{ծnvǭwfKt͕+Dh6/f !}?b`]]q~7wn{|{qҋ;\>G-[ݽt^aoP֚j7.}-|+OerY5t+%[!048[kX.\?c׭ݽsLYA7נKr"1-9*k^D朑"uJG:IrL.`2IsWmN ĺ3hi`]bSpQ~Ȥ%3ci꠬A ac R#6(.0;,J8Εea (IVg>qMZH Qʁ-Z8[Z qyζ^9nS5LJn|!ei\*oe6E^j@hK o%J~:/|:MzUEe>~\dz_fOjlnq~ 8{v՞L |?˟ˡg;f+6{goߟQm<>/C%^py|6gq͑uJZtJwJO(J+LH!*[j5]+UxWoź$K;W%\]+V38u*Qr+(E` XU \h8u( pv):W%U ׈E pbD2ہ+eshzpEk1X坁-S=\A2 ;W%\W%ZW%JizzpU&YKܡ E+zx*Q]l=+ K6/ W W#/33(id)z,G!*i \p5 \hE)p 8W\uJ;Wf+pUxpU\Yہ+yUlew઄+pU~azzp\,ewƮJwJW|p% 坁+ YgUS$\)诧b[g7~2UPpC05ˢ1 ӃD?}&Ц|nӴ=ZG=K=NJ 6 w乨NFxqvU9`(m9~Q`x`BߎBQDOq'lϊwsF6OghLfJzX?F-yɐAbc&N& Q+B>hR97cn00x-NRMN?,YjY# :bnY[HQ( 4^\ֺ3kfM[oQ(4mNQkQiʍ'Z6pߓuxxmfmsb(M) Tja%x`Q)qV`ׇJGA`뎨-"\V8;poՎt" VRSc-KV'gd&k\:EI61B `?c^2GY_ne4Zmrո[yN8qo3 Y텎ʕQ^EKٸn![ZF]ncn*PMeʾl4h=SmvEg7ӋaΫdzY[׋.]KyK'\a(5| c8{XF'@?omTDlTs8oF߃M|Xsי/j. 
)AlODJxvsgS-u[vv[ygef5 ۫7bn/p=:1lzʃz(  N[0+ >k:c`sg[ l (G0-%dLkZwy5-5l8E ŊZ: Dy|㝇e׫<ڞbAV}r{v3+H!pg@8KCN$ xtH,7yђŔU=$ԬKeIr-kk3r1+D5Ƅ k HdT-ݞNQavNi7W{XAo10xbODi~d_(d@w,w=y!y;U)A+U|/ F)]]CP{ ӛ _0*Hvhu'U:}2{VV'o%)n31VMQ0t2$r" B`EV^V~%fEDN6IS疇齈+x!~Bj=xH>zEBY(fTğ/' T+H W:6؀D3g晨Ng$xf3` ')rQҍBpKlWk'*ϴف(x\IVeAX+:fC,Qȕyfpv9.ԇ׫[y;{fYxj/_.|>n&MU/,{Z{T.e ⫒|e\Mf٬ٓױbxV0^.O ү \|1 ěAoJ{JBJIQC2ͩIsW a:B-IAJS!:˓6bLԦs-J k9<ޫ50 gS;yeqHeن0Saz׶{BTO0P}g 0WR*kek f|ЀDQY45)vosAO;kNhלob!EBB N)Z{1TJIHb ACx6Xxչ_KI `pK_ r!Μ&w;+^iopv#I[!L pANmw`IL:tkTxxK0ffYFL͵9$c6sP2mY幢mH2A?K^~A*!1ȕkLPv#4n<)w"Wߏl:],fNV@D~j9\wp:SoN\v_o9?NEl&fkweoUT~ƣq$7$ȥM<, X6P;W,w&2Xe#'b:Ȩ1fY@ГBYBFmz.EWdD j#cpv#c=R ՌC*c!XX<0Clx9~I&_ /e|fdgR3Mc12fRd`<%ݮux-B Swc5"0=() ¨cf@ЪL2zjhW$橠vq(jʨ-zA8k2&Pƣ5۬t+ I#cEUToB&fȑTtIĂ&("QtQCJsZ8oi<,ˇǡ{Dq[7X,5Cbِ$ب\Yxp% <ݒf8c?H'ǩ!3>`0Ʉ@JP 9ZFěDE 'ͼqXgZr(.ʸ({\qq[ĸ>LGzaXn@V3 @P»OՎSCu>-m}2M_>rh`'Oy0۾[x]"~ ~Lhh$PRFv2!:cD&kBF&-8 b)Ayڙb-$d@A|( 0NG r Xe؀e 6r t;Od{y_8=ҼL~: )&xb [gy-*/ä>kBxV%9eؑa)k6?+I6)"Gɉ p 4PQZz3D$ixY$В1A\@UfCQaNrY)YmZ((eF$sI4. s-W g79"LN+hm=byOI^ah>7G3-'T+1Z)J#Hɔ^0eƯr\h doXRZHk%5+U="< "<=0 <`ɤpMfLJm*2.)!2*JvqG@0yPB"6!&.v7wI_)9 ~? dsr%A?mȒ"ɓu߯dY6%ZmiAfՏUz/2;pf2nQO2_2"!Ijp<4:zTF,! H8>GJI<x:=4BO<_Xa)Kj QgzÓ$ҳVpHBj_6{Ίs6tq9jkU%b,GR%m":Z(KIQ$#K7*Za71^0e&~4g?~hx,*wIIN 8'T2o}!P˓5OjoTk."ZQ|۲zrYE6D*zE4}/fN)ƾɏqyWP2 ./,ƷVREf-%Kj7>\)2` ރR}]_~xZ<\ ګ\=63'qJ V0$; 8U8u\lSڢ۠*j #VDsXx]J읰[\UX?Y)a4/UVN~䮁+NgJ\.G~Ӫ|opŨZ2ri4a|]_7_-^T p>U{"-C\v]MP>f SeNxC3L/+>"0Kȟ-LRTO5Q F0IEMKIBOhK-!Eiz? 
(C4罥BHhP"u;]nb=SۈPB_5FmyWjn;ZMng@VqSk2w-pU_][=d^Ҭҭ kLndҒ+&.B,>)m4~ K 4Fs.>z3U݋[rbQ/fUzzJDi +p4vįWOׁpГ:,O+ծrC=" Kv4pUGWە8 WK+zzpŸU/NUf+dWz,p*siWo#̰GW]M2F:\e.m EB(MaMUɮ=2vp\ ہ+)1 xή2h̭<̥Ϯ"\)a0GWȰh*ٱUVC̥=\ERӻp|QYܩƢQ÷OڬdDD I&$b%VօRQiP%+5_Hput-0g.dݎd 6婭u~|i۷w'[2A+w'ߺ{{J,VTE:u\ Z G1> bIGP:`4.H^T.$Vl7{ov{[RuM(y3\/wr Q8`j2$Dzr?K-vQ6j.> ;suY^l>bB}F_d5.zh y~죡g^->Ow+IngQ0+cHvl$J-k4]&+stu&F2mdQhe 0!ˬB:&+P1B)VQAr2P 7i#2z +jglWTARם>,N=Y[%d9ݕ~zg,?|਼K<4S *XHog4f0VHiAFxȨN1&A U1&NBD]EѤ$6IT;ۑ;[3,L2vB1  usd˸سٲvږo~5pl#v&9)P7 0SZD"LHO2:iK:T O!;{.2½geJ%Q3x161aAXU֛؝xW\CAΤcW6Q`7Y0eƒ$6)$"QĕdH#~ HRi%Dː Xŀp>8TDI6@+vv<,Ň Ǯ(:FD#b$XtĀ0@qdЛ谡‡J9yJ;@m45?7)8ͪGQKg|P`B@O ͝"06;-Y.G\-:YggR+.qQ(?+8#@(O>2lj6Tu$[j@R01Žq9pP38h&=Fp|> t`` ̠Nxˆ>9 *Ljd<)V2""&ZH0<D;V;kNK4tL>z4_ΏP?r{>$]NAA;]jzxZ}gѷr(}qs2FRd=/Z`18^KYrZs {,!rƽ`+j uʪ倩 eTS˽ @"X|>&T'1&^LU:+HC$.OOV(LZEp7+x(Wo o8Va9 z"{a4 Vcqa2\(Km20FFL ߋFK/Mh1BKcX8?@QZ`#,rGaydgTdF_Ipݸqu^,9FzI_NMh0g.CC PԵf7S?.xu$F^LeX|b cz1p18?#gľS7NvWIί¼8 Uv0dp[LeJ &v-LPva&~\`βäOxqkTFKyJdM {$uц:mGOePo?nUTe*1Š [u6u:x:oWGm5V=E=+G)pvop<`.eP,|݀R{(#;>n~MVQHs8ΧWY9rNWӾx1,;u|Kp~"Xa.(Gp3Bn2#,ؒ:Cu_d?a"&shzef?MӪ]Z}=ϭ啦'9TSe?}6Yq gހМY3 ]skbBG@e'ςփ>8/Og紐 c4_t1uOf~(K?NIy |3v@;U?MUvsT8$2,G zr~VYf̓hF%( y/4M8h@e')puOо")$煉pSÌĕ6?w39%%;[G9:]=}ZnxsUs}|~ Z@T=SFMw׽1u7}BSb2KGj{k.M7-#4tP]0{e"e|[SLDU.hk[WXx0t&vIMd.`o)_)傥O $v& c1cyFHIA7ۻ93-8}yơYHt`#ϢŸXHp"sG #(T zā.x_ώ&sVWq,t ^VP[R s?HBb&fOn~ͫ.O 0eBIknX΢pVbBB BF!2{Ɛ"p ue& 6^o?A< kpYsZh'V!*OSKF.z(n"ᕦU0,5,+mv4)C)),{`b 㙠)frL8Fv5&@|B-4 Dip#4TD!*fEwZI%̈``"Ay*Q42>0 UctC^{ʾ*.k |Gv~;[L8,zp}ܩL$eYIf&> elӜhgqc-4]V,lZ4q^ -,VRE5F+z4$gA ״ u b?zw M6<~t^oT쪙"v="YV[>D0@d0Cs)(H+70EvP,`M{˿ a0Q$B rRH1wq+- dQg+dI!6YkF2)"$)DG X4QBrs_/<>#?rF`bI 7:h$[-c BHR`hTNSС3x@O7tX{]#7wt`륓tL4e\vn:PWA] QjK%Bd):Dͬ *%NQcpԠk!DBTbD B*CP!k5f ZDD4zl5RZ":ƈ5S'/>"ǞjLr1vrWK.n:m{Z,G`.i&kz=@^ۓ.]#69ZLB,.U1fX9nt^IbcosO<طdu=wp^R^+z˻JkoIl)xzʝ=zF5,vClIWoi !i[nLtw]=ӭ뵄wށs2ZLάfSB3!{ݽ{Huw1Iڨ(@%"Hnj$j n, f }g%@Sj1pDwPC0—M/=9}P{L+=4A0ɵ r B($<3`NLj9-J1Tk%6|fs*wYq"dOc_Jy4>lҢykB2K,|d<7 }56?N.'ӛI%[j.{F{)9mo>xcF 4ZƲ䣌{7I/; 
AfnBP䊥0c* oku/]z(Z>#s>.t gʢ5"P\rJa6>)t\ _dSou^olJa^iGjd>9HsE%K@v_v:Ϗu-kb#WBZ i4FE"7c0x/:X^r+{A6ƒEmCb=b?f!n0q(dT$xI0 !P 6L@•Xd1+!j+;=ŤcB)7j<[֑aYqGkRR>k%'RHDcׁ;kSm)`o|/^8J=΀Aq{+ANҝC?)eM;`_bPEf$utZ :_5*ս5[z##5сm9`-3HG]D;͈1D-G@BQ*cUq^oMoց9ـMࣇ19t1kӟSg\C gJg=ܧRc*8<]$y.e))t~+ n c(}=Ok9oI%[L)ޮZ|eSתWr<#{wg[䕲vw5̾.O/4YVݬ,gIQ !T}:u &"n{dzjl>ryM׈KRٻF$W=<"/i{3=/vSiTw ,%RR,d_Fd1bMǂJ Hƃ(BF`fbuAbuDf: 8'&X"f)2}&ೊ0m2`5LK85MTV; Tx㩎F&!o'ZZҠ%a1rV.13u87GGn/} 9;6%-72ֲn"PL.#{ĴkE5 Wۅ C1 {MZ'wH,8 Q 5"V1@yp>x'UT=8Y;1'UR$'tb p2'\O"S屟Xd* '̉[szpڈ`T W;ga6V6R+\z*9S+$Xpv2pp*p EWWL?~B 'W\Or%X8zT++ 6 \qE ;9;=vTJh(ʜ\!RU&Q:vTv5•PL\sRN{gl ܟ{nipǣ%~ q Fj0u~d;M\k'3s=H |('GC'0ygz8*׼`T'kS/?~f,,3ߙXM:;M2&&yBY^.k ̤>&O47`DY箵h^E#58g.E 3]Bߜu 7ΡSt-sK z̶ J\}{y׼h j1~B,8ɕTVqGs>0(^Ayȥ~yk#o{e`MfD/~*C>:i4 ܿ[E-u/[c Kəd'B"/#1RwH~rK/kФ~`GK[o}Yg.›3&4UNdƜA QE4;z˅S/K>Ry!淳ŀ*鷳oooGv!u?Sħ~,)u(wб,h͖:3|QCYaٺCf:]lzR.k9yc%ryetᮭ9'/> oҁj \el #ʽiG }ji\QL8kZ$*GѢ0mV9:rib%QhςL B8Gz^[ǒL9L@3ndUhf Jq&̈́e6TByΒj}Y#gr8W2Q1j6ff=8O71o7/ӌ?*xRxyb&CQQJFʠU º@B(P52v+C+o}#M5h>HڂK8 Z삕~7obД}ss†bm{_W~mȏ_Pk"e<'y \|RB˄ 5J5l^g%7e>(RGcpHCeVJJ9T-)Qc !xj(EbaES:oC"e4O)O6o+djccVK}a)dWs:ۯדRkG"'&ig }S$q1Ìt4*$퉶PʚhF%((Y)DYȍLˌR+yfX?VldtW7~J2>''K~ފi=Xh}zT*7ʫw@[47@4$HƣKobѹ:2CAiExTjL_jBHΑ '\ye8l :"1H*9MlBj#g5264qƶXBb^2jЊ<*Ě,2%*Qx @J acҨDf`<Dl?EDhEq6hENP(cE4Z 6xd1Oq+mOf1QY(q5D%Uk]2QgGw\>'r{oQ/;3섲"ɕd5!}LmvW5Wq;UY(wޕZ'}e+Jsх^=z͇#hfB@f\:XFyT$ c)?&$Z)XLƻhtL[wo,sK66ޟ+ޖ1=K:cELSlj "R):xdAq`LE B劍[`h^;ɧhL5GW^j%;$GR$7m»F]p-<^W^-=#K=Ja[ IȵPH5o Ac7&s5H})E̮48dehQ)EM[vn5`~ِgz(ƹ]k09oDi#peTdH D'%< K72({(G)ɞQ̾ǫeu? 
[unrecoverable binary data: gzip-compressed kubelet.log from a tar archive, rendered as text — no readable content]
ܚ&Bơ dIN{oH=H dPR2[\,zdx)AM>fuѨb灏GR!~5]o|$ejkq1f|1۔e1Id@D)f65Itl"G6lO|_//S~چ;z&Chp "mC^E:c.e|vFH Ec U0_=B!tItY,3K~Gj;|{{K/P/wax(2#CЈR&1 h SsTfotT"_6ѭ] ~[:ȮIzɬ*|=MGGz8~f_c7''X9C~ۣ_>8}n0rw8I_aE}8V\-#߸jyĶʟ'm7XzC!SUw]8( # Xu,fScaYJa,P??w 0)hHB ?D=C,";zjz^ dI9g- ˘ xI: XK^HkZRW1|7~&m|"~^޶ 6 [-:;P%ԳB=(<;~^pXБ _Ġv!AR%'_HzἺACޖmkkdkmGZ}<@>{k˺C|Լo3+5Wq!Ak8b=bG}n\6!i(bLty_@DQzDLEr ]Tc7fu[ S )yq@'(/%6J{І݇`>}@ }b.ROj0G ,(!0eËwHIF-Ɯ;mfUоgU8 Y%FeOFZ%9BF3, \҉ɃrްK̒; lYCUlGX;òek;<5 ar߾qkm0ɥ4(F m .">(;|$<{=:g9PRR-A'|=}(]mF} gɐ M(I,MI"y[dT]!1  )[]fُ3Wz=n_WeL6[kw;z5)U[d&AW g«((hm(KPAŗ$M|48jmY^Fw{uY^ :)D )EB͌?3PcR-.ik{6M_]~\Y 2;zfP-I VTL_2dno"V  @$u BH%;kdaE3I3%S+Vl]جp~,/L ZAlut|!(-"[cH*쩄V(k/0"v6wf̞}9M&]hNh;G]64l2E da6dʪ _FǚK IJM_ ]wV`3&kA-a+~%ny0'a/ #ꇓc]pcԮq2801C$ C` 9-UXDr:|P j#%H(3[WFIK^@JZb[$>'' %ntSPhѨ<ޡ?^n=XNknHXbGȟ?g {8vtLHGCGΫç/?u@aQC* OMɢ0D2I9#f|B0DUli|-4GIio#h22:K80ZYT(AHa\ҪVp#։q/M?m149Y6f#lVη ouOi_9{@xWBftMY XB#o^ F1wm"+Jj6#i8{ۯ7ʤ+"8(v܁o;y&~Ah_o-B^0|όZi1E5SRh *Y"ϸf<+7("DB\dXkaX$Q)]Sb43E*Ma# DVPz9I.3=dJ ^jwQvi6rvWoVd.ˉy;1 aI}tj*,+ 8$rZQ )U[A%`$FiYOd jld)$YDш m49:'0~sj8 Ki>i.WN}r/MTpΘ^οejvL=yy:ke nwJg~}$ck[ S$s)5 ˹M`Fzdɮ}oѕKbQEKŪRMfYQ@KEJ6Tkl85c;L6Յ.T]>.QNJ=7G~gŗEZyUܽCLf'O#;9,d(eIAN:@Dȍ)贈F[m3.P^JVUmj뒅h!;Y0Om Ii_c4Qcv}nu4y,@b^$eDźTdU-nr A:^c]H1Cfd(YtXd*R,GQ|! ,*"Bc}l8H^DVx,FFq׎GD $8עHd6b-QZDl H7G)d@*zE Tz&ȒNLZIHo\gs7{GY/.p.^zŝ^~$.1>J:>(DatJЋǢkHeOI౺_ †`?m W$G eA{Nݙ\^J0`IuNӷz2x*xG0%Xc媝9/1SnCn.Q\~àWlK7m0=c. !MPYϓ'7ǽ!@}3ZRbSυSiiBZ !G*<Ȗ !L%]A4GM$lL(5ӓeGp~vGM*l;N:Rum3MKU9)I]ʼndܳ<2wzt>wo_"n߄/z/EUZeCD,۽ bTW]T U)*lk8ʉ,Mpld]'K)ܰ?@0_{r=g}ifc s]^,nm-3iWso;7_S;]ʝb^7R &zR < ksNŷa9`^E?T,UOK*rz38&fJ"EX]G'_&៴T =FǦOxcɃ4Ɨn-w9bO/'MbC.i[sT{4آ(P){Ŕzn2,_͖/~${㋩SL8+W.:k+*3ދ1L|o^z|G]ZEPœx=?.loMa 0k~\kTAVE\ 9EL2˭85s.7;Gkz q e kL"XEyg ajK%! 
ذA}~\4|!c?~y秡a{Dmߝ r GO?~?[[^'YٛǞ\Vżzo}X\_*] Mfw费Ameo,2,g6+!$1>+)~qo (ckWAgt?JJ6" yRAbtPtW n<̗UNZ[";v'Vlx|TһT{\9 gƅVq ,|+X^l}7}{{{-*@8|@8k~Zk]܋?yhɜd5"c7>|e"ǖ J"26I5Y]M.ˁ*7lw!i90o%z[;n۳ǿzD\mǮYmnLO(uw5r ={^,luRv'ajc\je3B JU!kO9WS*YlUsTNnj?_^kYM`7Sk-6Peqzzq&[G|wkK͆ViM;i^1'Bk?}0 ԜQDU^ٙJj[s!+f[uAur=,J&5}kt SavnE;IFBHHI?aAS„ҘNqJ^cOi4/*S3Օkʧh(8S'_c 1KuXtd yoOMCw_Kͺ#oG LcmTȗji`ERNi,ܢ@՘^ XTl`t[w \ q9uqOL˓:<]/%:Q&ɒte& E6ðn %f=V`,Bn[HSIlf1k0y֚BԦq4RVO ]Șrt0ٰ''토DluP5\Fѹ'ȍm"ӤA,WRw*T`=@Ð:C2&MY D`݁z[}Ҭ'bX&W,P|?^y 8͵c횡knT1;XIy/iU S'ƈ,*F:JH,ܽC`TuJc[OIBmT621(Ccs DnV< ƚ4ĦUL-25(}-)᪥ 9'5:ӮA^ ȒzVoZ [vg¨spbǍU >N`*g0`BZ4p䋞 s!5 n TS5& #Cx٨4w K(N[>;o49i_dׂ/j o2[$:,iSܡZф^l0@eH䜨&dpHJO|q)n->u޿莅V@ ߋ heeMeWB2kch]n;8 |30<'<#*!#Q R`#mu$;v٬tVgg؏HS+ g`ݡzx-?'nMk|s/f$Jirt댮.ESٸT&׊uY N\Ȕ^hgYgg!Od)RpRgx~OU@**YoLR2[&J󉞑gpW{h?C>x㋏zDt4.Tcɾػ&(ŠL٪+ ֘稟,w O) !XS/8˨"ES+|G9_%7OrQ޶}[,Wew Ky9h붚Z.cD`C6'1#샍8{B#&/Ø+4!o=}R/j3ً3ɨC4oTܻ&7#pqRjA&Kg;|T>(` _&;]jq%zI} 3F5+rW2+>4v5)[^Ի S _l1͠זIjM|jBǻR+IYJmC%}4=jyWvmICkl,({ka ^ZNC[gӼlζ&h)+Si"yylۿԾSO]b _=1 ấ,qQ}wn Lw}I*'L~1x''m(lbꦉr6:t_t8m;lj|4zJ`?JxYG%Ӓ ipr0"E62;刔hJY܄g)ώ:zEğO1!], k2 >9͂QAB3O'II$FJFm#:*N^/Vگsxc\y.!~0>,ۅg&W^"6q<Țm `YTnŢ/BP^Sےƙ@amɑϾ-P)8}EoXb[,zEo-ŢXb[,zEo-ŢXb[,zEo-ŢXb[,zEo-ŢXb[,zEo-ŢXb[,zEol[hM @ynWbü$PsG*@$Drͪ H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@zH ž&$,Ar_j@@=$@o jB$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D]$Vׄqb=hv:D*. p; }4^Fя?75|܎E4_&βzVb`0a[TcB+#[]kunyŁB_*NCd9/7M' \Y9.p`z]('t @BV_z`2Bm@~:8? 
|hȕV}*y*[IqX4EQK,Jp+1 #[*,~\] \r^m&X`):*prUbx@L}cek(+_A}{M벤𰙜wwû>ZQ?͵LcsX+R)7DoWWspy:/k$@hn{3K?/݊1eze9R2rAUqd/Hh$i,RIg?Ϙ%Ӆf·?0rGSW_}'n@sTY<ĚFYY,LIW2,rNHw GZ{Ϙ<9f qGB=_;;(u:mǝi9dI8 Hu&U-" &9xN.5sبX2#}c1 R=K\>,]qRGa:iwMFZ 7>M^ 0]QH.ŊBdP[itpvEخF7G^:Z{2hcQ[8JY 2Dq_"!E` ,Z{0J)),$CG!BdRpPϾqΤEμc"H>,2a&MBH'לx7jlnx0 s=㬨Mt(Ê>ߎIXKB=I3rkl?5\vgqkmZ-ؽpd2^XcI`R)[NgaTX\I4~,IcY*7` !%]6OtML$( GQA1I;ױ>쌜qE1F?NՈc(P#FƂ#8I&VRϦ8Y"ـ7aYEKs)DFtQ>C9GD\%XjzZ:θT:֋"ŽQ~VpF<':ژl $),@V0ϊ{z)θ\:M> [&V;><[Q~ /, X]*6S7w]@2+bVsT kpA=gT)/svUJ+o9J1Ʋh}RqEmVW9/g]wnsToǻgxx$6c/&~٭GkB9sr%"Z2dqít^])J]>)_N䕳s;S85[}-0]0HKXu^1Ёy9G!V|ʫi3I)(Zc`g<&+t$KgP $(')ń#xWͦtNYUN.c8"jlF_walr3& dʁSωJW9yʵ`x ox%INRe 6xPx1*T ƛ N `jc8ʸ8˨8@wP@^ `Ql͎R٤,] |:,ein{k77dVm--& 3HXTOѨeQn*|)T<(tvQT=y5c,R,.1U4RoxOf,pH$ۛ\CMq_Xszp21BLdFjfS"YCX*T%!$c(I~_^似7ºiZc. b "꫌=?L X+^Bn_z6X䅰Fzf)S|)}6LJQN osciuIx8_zߑ^ozಾ6,.&KE~hxzN%:H(18G2 WzBꥻ6ԾۣM@4݌pWF>yDm}0b>YLq8vqgkRC%6ώh1YGY>IIy`>H zmњ)֔&ܬ}!\, IkۆJ4]-hz4jۻ<>rt0nJEMu+%Xl>@F]hݮȞbۑw=].vDC>ycW|wDŽ2~3i꫿Ǔg{FW!;G`0Z`;`w~ȒFqWMRP0cKl6Uտ5Nb$AoԚ9ΊM)ep r2:{BS!;Z%tYwψZt3J`lRsf"|Okm3ǸJV@nug&ǼmV &hVaσVJ}gN ?`9ՆePxa[  ݖU\,gS /ڸR t=‹Uo8wTP!fz_̴:_4cWјj?8b#  %$K#(M4;f|FK(>q|͈zJh#$`dr&h.(ē PUQ$ʄjv[ӍHЈ1"ϓAr. 
bA MwކsdZ^^+' }ȗ˹F*iFk|L틬vw,Vvκc,`JYrN` w;!$OG.Q@8ƲÙXp+Ƈo:~qmc]L4q`'J3^%)XHڀF+JrDQr9@<^ˏ5A!I KFU76UF4DHFQPŧfܫ64(Y+T"PFg *_=L@/5Ui+RhN qBz#ѹRH8$[s80RtObV/"nB׋XtFx!U,2cڐL#Fn׿- N]iDhC.&*qF*EǩFly4T;bmnŴ#}O-ed5qj.k|XUTEKΨ7.-YW*#ڋ9{u'+?ըz'tL:y:҉9gFPR[Y\$%4RRZeL*yſӼ䲼 RNKy:z2mJEf>J+"0P{P Z:MmнbD I&71s PK$(@mrpk13Fm8=fc/Es}@u5&3 cӑ1&#`0dgBPM*~Mt˲^L-=Og#tj~]jT86xuޜ@\Ei]n6_ bW(SH)בWT*0MB <~WytO 7}-mѱGYP MUڮ[^G1e>;_̋tUx/Zo!]6܆FQ-BYG]SymWEmaqO=Λ![.z,ahemP*5.=F:|6=n7zq7ZtÑяUu< Lb@m@3;ؘc5ijzFw8ɱG&E_ Reߦf RǦ[P][ 7gу%KUrZRH9(AA7X.xҿ%NQJbpV3V'ɬ2*V bQ8Dԃ[%e \~jo"6;>aevjdB\2*"lKh4)O&D/Pa<~mPvh(d/k$᭶&H]Ҟ-U1öT36$8n}נ9)U[4W,UY֢8^oK1^q*pcI cA< .\B R!E9Kj"gUaڠxT5LB.㤦rd Tk8.P፧:lqq,PU4,y6c r G 7ݼqy)9_ٗxaݽLb}Xnn^T3b'!5qKz~%DڔhX eHuIqVNw;FTA Y Ij  8hBJ8ͽ 2ŠIkss[2Kݵ}<.=lP9_ }IxF3 c&s@T00`/ xQa|>}]{P^IBq\-J ⑯֕8%mf!$Hb>/޹2J$F fVD XC3wEuw4ZIpS}_unXS^kL82Y˵pAb"&`A݌ F8Bdi׉̞Sb _M_qx~(>I7O(s)pUw/Fύ|7kdi`؟ ;L:&^X<)FfO㞫Zښ6YPHmW B&ePtTq\:a^U9/oG2^C Ly>0HR gb Pu*S}.uJog(A +HTȝ "tJT@A!T nV|#NY w^ܓ9_{czuR^6Q;^QB@+XP E{0p0sW!KcC*sɜ q&]o86c^ m<1f; WBZ.Ά^"SXU18/a<w=@霤W:XI'bmfP8}F|aRAY QQ)ƂLhoQL6dՍ Aznl $ TR$2A3⩣FksNSRXn\ ׮(kwNi )svDҹ=*|wg=srhIMp>}P; Uƹއ$ Ң:P_ F@i9ˆ a:g0n,`,pMQAY\Ыͤ{Lya`dD-y@@善FeT'mUS|iuOCu@~ |)$Ѻ\M[ĕEe'M_l^O]xPF[&v:]şV+r>$l!?>c 4cOM.s䦼B˶g6?ՇlZػQE"M-go6Pԋj]UX&^~uTg Es,I(1?S[v963rͯDb d:O?"cWu;p{~ӿ:WLB:d !8h_ASտw5~Zh@$/9L"Db[) a~G+ Һ;_ۣ)g%eh@Zv1Д/whd`} ֙R>v·uXj4Un: t{ޟlV>\im]11yN'.;+St+dAq&W4Y,[2 ^A0pǸu?6=I\ymDr PR$7  qC 4doePL s*&,!:膳dZ߷a7H%sݝ)/ ;"Q6wCe8\e{D[=|#Qxzm/HH'0EO ap7^HR; (%GL狻0N3HoM0%i|ttdj#iN$xŘo8UQY>H^$O֏3t " qʒs=ra[ IȵPH5/IOAzBco wuZtСA= ٍs?ȞRB<3p.BY:o^%Q)NQV`i>PT# %pΉc>ثNd mln>W% Yʎ tPAvɖY(H]{oG*H~,p9#5E2$%G_<(5h{ ؖfzz{U]yL㘟ܪ(}_}9nvGxvmşqp[xo"ͧ7bt+Yl}vfU,qW~Vf䃤4p1w$i:=jsٛ<(2첰![w$K.gH=C|*By#grRdQ4o3zG!<1uz;OHNgb$9T nrIe 6d QBםuP(fc@m^n-棻f5حy7 _$?n4)hgV{~3lަnjx5EEV~ϋZyaœy_˚ʟ1pN43itu iu|Y )@+wif}kId%BPg^L3/w16PcME+J&$@piO~XzY2tUUyxxh/whϸתs.No)J9=i ~I40jӷ7Q$z{+ֶgH _k5rH})E)Ab ,9c5*Vq>X-Xb" #zpC ~!r^cy0+]3ߍ]^U0E,쨤(9rGFT^^b ѣFr&u igRi6r\J>'L.Rg:EKbL *yb{9,;q t:N}U;GG47,7"N@H\zdHjJHҗF 
Q:E h\ȗӲ;aY:ղu-*_LS$QK"Tq0f Dx2\;]@ 9@I$G^a#sP YD'k*8NT4iǹ2D& . ҡFS0$@ϰ7rTVպr޷)6 U\ L6[W}|png⦳UM)+@[ L!nGy"46xjRF2oa@Cp+Ɨ:^(vkClm-kY$.sJ90,$m@x %"Z(@>t< *C!^IczG[ 1cHѸ2oƽ :ac ::kJʨ̱߄s"_kk:FWTyAќr6Fsɥ8q6Ij=| J4|5Yٴ{UkkrY?zdPݎơ S}q;fl}bJSG\oScoqIѤ]$L9Xr͜/R4A\PUqF .fJIEq86es_#Vq #@8tM.%Y=ui/Y(dY}*߯":͹A_fj]ZT}:{eEf#Z5ܴ#hnߖCvQk:^(lM}-cvSΐ?r]y:KAP'V%_v0LO}dOeOZWOў wD*cm)h& 8]D#%T Z٤W]V{*X ("[ Q!8Q)E)%~r"d\sj/Գ'N=Ǩzgto :?>ANĜ+_T+XN^Ry F*^Jkk}^:/i8#W<&1Qd죴" ?0Ԇ~=5Q@,&EbhP @-PYíN*d7rz04*_&&E5 9)jLZ I1 Q˘Y掩n쎕 A5OFn4﷕dBiy:ᦓm7z_.ݠruK|2oNUn*Gz;!]L!$UTU`o YN!Ni@糨=?#qE|_4R4B캕jW.~ {Jw1WgpmuyМ**VQtuB&YCƨ"tdo׏zDzksbvmc7Dؕ$!v5bWա UY 7 "ƽZڏjSq&(Aseb܇(ך" q;qxfOcf÷ N~uʴ@^z֭H\kJ=!&ji.*`wItOZ#U!2n=δijhe5zqt4/x'gqQO.uǻ>}m~|L?N:L.{ %xmy f@pq VplPkP`i6Z!R`$¥c1NUr6MlA*|W=`*"hai)<1 y`NE=`/Su2狮z瀳lsJJM$b*khe:%* *7F+0pV'}+oi;g^΄9wV[u5ӶJP[POq#Ԛ3*+PMk%RG|͏U^ԒdW@-͢-jgɢ=L~&䶎V9E8IwKl>CFRָI70>"WqY #KwۚDJι+q!9#w`C/ o2"89cqr -u<fvod4BkbHk~4[f[v%'X1,,q$.W3@g@H%^=G$2OZ%bP?J_#q4IކSFp!(\Xύ$!JʘT&H@xFf @AFԒ X=XT^o]1@C7(k#c1@717g-mz[R{z%*pFQr $p LuyKLuF8S YLD9DPVMP&o IqI'5UC8$LٴH]Ou42QǸYw "8O`Ahg9qf.<1* kJv{/^`gsk{|Mz}#jr?T01^}#9Jo-qTNz)JJ)}V)HWQ;FTA Y Ij  8hBJ8ͽ 2|- _0y~qUg!ЮB<rA.]>$c(/Tγ+ϮH1_ɾknx587BGRh^ m|˝W˻_](&WR(Z^:l))1Q"X% 'pGz^[ǒL9L@l7WpD$.}@*b"9438~H\ّMqHlHַ|~yHl+AUϐElڤ6`KLwSOuWw v$rbvay)L\a P:Y,Pj #1Jd|IsYSyNš(t"8WJ5gƷfii-w%I%ǰIMJ{bqb2v/dg_xP_j.35ӞŎ^c?ͪX-Ïޗ=W:YS6ud< 1A"LHp螢jZ .[N<!{60ceɥA&10LT=)}Nln8 !xbk^>9 gtȤv`! 
L*yC s+A!ĥ`-CK e/za̐FўG&D%L{}~`E REemaX9a"c<X zD()kX$Ho9>5Hh631$L\.9nil)!?068ͦGԀg\p^CQRţ#i[ERϱ)՜,"ڡ_S:b(/~Q/>g3 f9Qh # Pc$)V$wCDZC?< ?\sxo1rlI,cd&zw~Jُq^~à^([:K^s4 LPRXDHfFVqY"CQouDrNe7$'#Gჩ.A =0_)ʀ)4OYj݅B~T'jisi9b\1oOb>#wQ+cdF%% ""WU`'y.4JQ#e g@!:894CBb=̽7W*04F~ݑڳ˛ӮֿqP8G,fOeA>8= Ip# D M"D)Ljo`E D8+:+ku ^:-!464ft4q!AAk/ .kζh˼oXc 2Nb@Wa=s23AIUH!Ӕ#AӳX0͋ŧh!TA5` 1CF0FH&, 9Q*KaMy#Χ3+B,X!x.%}rlUN0Rd<1=s+91;NK!9d>1w7t'oWjl ?5rUl ~ x؏6&Z <S -:}8[k~gs('{ZfWx}N:SYP]3T!ht6oWG̳U^lM.ήVH;`5zꇪMmԟa~0~FaHWsr'ADjιYRSm =`kwY XϨϿ ^Zzvao{M ~Yf~,WjݛM7ˆ.y{0 gǬ:%lK:t$NIO<կW+n^!'+n _8fd8;S@;”S8S=xLyԜu$kOgߪcS![K.T;Ɓut&Mg6ggrl+-EZ>q%!gz2#aa)]~7HX Şz)ptz6WZ5x38VyP@p9;sVSVE׃+{6a+U|r^egwu#1*˾&=$v6ֹ-ޭDmPW^ZU{/Tz:~ܤkMMG_..^;ywyg## oYwKu]'Ryj-:uyю ۑВazzʵA].ҫ.|%u J҂'%cATbu Ŷ~jf8B5vƒa]LW<%- ֹvN5\UK#҃,]*'/6wb\0Q/ۜ;ɸܶ!5 ^yuKq~Ͽ ^l[ÏY;Ӳ%MǚUg K+d!={q0¨}N6u \YGL*꫆ aOw䇷eO s԰vTχw0Iӝ \WCDޅ\\wa`nfqZoEў2Njo/hTo=^fю.o>heA ?e{m6hO 7(mNK %;&uxgyzZ/1!={GTpW3V +4Ku +$xCj/-#JYш2.w݅2cLS:Ʒk$Յ푋 o#֧|\>jS>.N/dI|ohz`< u1t$:d򈏴ce6l$ECGgtuޔpy[iY/'bf%=)fh6{0s|UtiPrޮrܜUv1yPV臣h[0OPB?ߥ9EG~ WFQܯ F^?b;Zc'74Dt>-N<4g^?/O+M- mk`򿧷>>e?BY1uW% &F -PSg%43}7sʦCYWCq鬭ËM,WI.zpATrpZ;d8Z`h[p#[X4Ćqh! 
$ X (OeḌ` G@kkKA,f%!$H^G ȢGxP!l Ƞ沓6wmY4ض,f3@ݙ`|"LH{(=pL6oUݺ{k) vc[1N[TuLr抩˔hr6 4 h`Õ1EB2~2w1`DhJ` Ü@#܁?U2ӉTIXs L#C:b8K4YF%Gh3Ko^5 BYUۣ ^`VAc-B@̤A@ |^3&-K X{0sɺ6v1'Xi^a Igy.j #e`09q=)糀_ φ|Tk>spݰ(A/7C|C G7+lE^H"\x>W+f)oGu63 ;b{B!DE HbQ<T Cg=kU4~ aQ0'#80`ci +725H 5j*\#|r\̤XH#\5%i\4 Z`ɑ݃r5Wh[Mg TfĨ:8F(ǔpJfP z̀|Qbe럟F`JnÈ9>X2Q )b8e+C`3fR [0x @p46jgC1IĪXU]|3 RLȎYvT%\FCd]Xr준)`$diNfץ5=];gGRV.qXK0NWΡkk{6=A۟>}.`R:{y3oˋ^`KGkH.,R9wl*ߕxXr+i覹WՇ9ޭmZX.gg\ؗ h>kmW++\\lZZ͖ݔi0F$ {G}fy9mGۢoӇ܄}:_MƻJ oBou\'Oܞ ~??CUdsR`Zl@ֻ'H&@qR@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)]%oTI E{Q\ΟJ@Q$%7R TV@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H *Wg랍=`+’[TI9&%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ RU!(@0?wVJ@RyR}J !@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H (>-[ZoūɋԴ//O@S)y|xF{j:kI @&0qE$6q<笯ɼh~bl.2M˳g颤7x]VY:bVJ~ԭ \;RG^7㝪ɉ^~mud|'X~qNoƙ)c> {Ȩ27FZeJtYcJåQuY,ʐǎᱏ4;Mjh0 YYqihq&gyH#wj%oO=1DYLrۦ! f6.v1ʼy@q$h)sHWE uqs`Ȟɖ}Y>! ㎋pſnu WGyw0^gA*/:I㊁]^=-3HF0X Z-ΔYX{z?ﭳ.4 8ԿJ_{Ǘ; v薯g:3jN6{s )~R0~_Ӿk, Jtu1n']?GtxQE]o7vK`)]ME)^:5pGvBXk W=Q.(QK0=+#wMӽ14l9U`f-rwCjҡ'|8hW$7lI-߾vsgGZY$ȾmvC7{>!.vrlwKj{f`}2j9)8z#voy7 C|, hnyКi*! Bv [k8V},"0`55Y7aj ǻ*fRJD7@+L.}q N휄Pý:A`pF cG s`)𛳻[÷x~O/uk؁+7VEH7jHX; U J=U ^dl$mb6.h[u38gqzCE[&e8"xp̣U3"KVvE%$EQ"O!͚߿ A+Gg걢p2l\Bvt"—dҾ2w,$ G[Yb -c }F=~qa9^#&E\7e{߯2CMguvIpٗXivFj+hc.(g't7O9`|8Og[p-jHՇH5I/_*+lrUꬴeUZG&q"0Vmb6M&`?Nd9}|kq[s_Vgi/>mylO!{5Ge1ԮUJo7iO1{4/Nh^YU'2}JbI}u;ӫan7Of~yLUg9rC J^o&Y>Ԃ' >5Ӿy֛A?7_LLjrn7^Vԟ7*k=kCzX!(K>".#2.LW([:QFۈn_,ݧFՐ<:TSJ8XtX\}]D6I7X,WW|Gجy&m/lSY]|a6$XERzp6o_zu˫&ͼ7~a|ӫ#~Uɧ-M .n}d‡#7ڒ &;nPf{=m3g>;xу{9~{]ĝm6* r*6)RYUk0i͜^(['V;~C#=@r:l_4EAVCnUb V$gTTʒcќet̢c] ,e7JknYVחC7G[MbOJo/}K>}DGƅO>V]^偦m4.q“pӆA,|gI;'q,}DLg}t՟.\*Y:NѺ s|ꩻ5_NQlεʊGEZ=¢/}->Χ؎qQf@ꥋ:&)k@0I1̨t׎LG3D.>^+s͒iR0Cn6ݷ_ow3o@ibF5yA+d\ wA2 !+#Qp/QG!ڮW6'bo 9q]Cq_yYQ9c_{dRfX! 
P|9fN/5#fw1Vg.I"?{V0O9v3yؗ}m 0QViYrKJ }YGۊ.\4sQ9$#vYEJkId+Lm@&dĒ1[= VN 9XJTDV !@c7"dL%1 ߟMds;L(]>]̸w5s5e4qW'5s6o%n~bn}ZtP,%$LX?5tNi[[QoYjWͪl c5}.R1o|;5!%ZTMR$q>zo}Zl1~U^m4qg=-F}B*4#ޣ+x['MֹI0΅/+m*xTٷ1.o!F\ _|k[[+*Q9F`B0*h;O+ ksJۯU)i߀6=\_V ]uW>~,*X8k%M;iШm=2zw}Tk45 ;2iK^9?Cjcgw)tUJwLL}{<ϻ~pxR'Qy܄,z |W\'9d4"#e.kmet*}q^^{#?}~G`E9H2ѕBZ~j0e\E|C8I8I~^؞T=s{XoM:옿ҎJ1_Q_ՙ ȝPѲUN!:m!h+h IKXNQ~|ưF,f_Xܲ2̾tU.kTHEJ/<+o}kx;ʿb-*u75kjIzl׵nyvJ}}wo{~sۓ[T|0No}xe-1gY_^z+~Bgnwpg6ԇno/'yhwjIo WCݍ m3Zgm>{.bO믖z{*BTo6$/_-j2Mɕ}tDyJ3R[72@$)g:i,NJA92 )K. P!iʆL4%AmIǁYΩ2+{>XK.4>a't Jl+i|p/׋GZZSHV(U޳jdF_ANBzqZLV*T(I݊H(M>RZĚؔ|IBA:`x؁$kp8;(7ެ~ !{OcJ)0hOǴ/wWOuI`>f7p0M!M1tE+` 2*`p0ꇨ MWNaa?Ý^wJM,I9g6u@I*b֙XKF5 iBNݩ ܮ^wb26GuȣhE3 D\F8_>v{oɞ:]{%ڜ̓23@_^ ?- oӆl"7"}3.EeZ{l~zUx?G|:e-)bK0 /xw_6Wyp^o葌!(LBQ dQ"b3K:o3(]&z{;@{mxvvB{yXlJH yN뗲ӸYT waυfȮ+;tb`֚"tcg8@'@;'Zx“g>Ldٻ⁲)?b@Vz2>O(=(!c`tMw112!) E1&b6 DQ>p UZ7o6㻘V;;_q=Z{4K_Kl"{Rz!Qe/>ٿJ.;PF:ٚsΏpF=lԬ f$6LEk3W«H+*zx6'2v3ac,ɉM[tV?wX~hxI 5Ib=zBjVѫ(q(Rަ~wdf5hڻ3'v̺p1E1۔e1 d V&ZQD_rpM'vxa.CPt+xb1V[Ht͸E fgd+@i%r]=C:a{ڢ{N &2T6 MG+AإLsȟoiR:@WT5A=R.w~}}=6RTfg4v^Xi BWI"4CP}I)OBCbMd!3(v̪(gP -m!BE@Lvl mؓt(22v JKQ`MP(eH3[ xmlz^ ԀӞ(SW8cSOdl}oN~D9Y"挒fQ6!fYd2.f5v(,VKr>#σ6Qztw.WT]K,C<vo^ǿay ø'ZHcy],s%|L:^|J*Y g<86?(;,sFYAL"QHi`Pwם ZcIAsJF)bO2IHO|Vt*NS [ %,v)DQ1WTX]'tBJhٍd'FiY &2+ld)@lT"+eZhrtNXQ 縢Y߀a)Ͷq!Y,t<=Y ͩQ>$MuijGn*k\a `b=  fh1KvZ0jjw%( lz(b55e_j#VZnѡjm elUfq-ƶ[cE{ž˃$ɞ< &_b'KT.u:M1QEQDl5蔈Ftpʫ1Tg/d*FPPͦ.Yvcnj8-vN\v38jV[ V{@p&>j(XWJhcl Tv,ր2TRͶXR;FR95D@e6& ȅ͆sNz#sT=X"x숑I6?(kų7acOf["JV 6Nm6 z)h%IV5 ڙc,YؓTpRJٺM< lJ)LJN]4],?  
Jp3!ЬdDVvXJ{X`.=l&biLX}_ ȣ7 4%$!xg?N?M[ W9Շ uNcvjADHx]pv:~9_T@C"$Buމ-'L'}"$#Y+aYi0g"I,XT٫@P =94Ax9:Q֣Qvu%!4bEPJZ(Qh(0f)E"6ٳs2JEpM6kq^$W;P滵g˒5gui/SwC76paw}%UA\{w^s餃Ԣd@)Tdcc6R'#| cAZ'O';zFE YZ`I:cN JdcP*ZIL63b^_ T Z+@MAܭ]en`W3@U$'%YK4@E.T] 萉v7@`l{֌K0S.PG%,*yweѧX%% # E"f 䉳Ȍ4I ijYI9[`g`*=#YlljL䙈 ,*CDB7B'~$Z*z{ImiSwP?~ +yAŧ/ߌK6:^f\qjsp&]ϚڤQ3>Y~Q_|ֳ^|;i׼Eŋ9f|t2m"f0Y{Jxе=ݟk aoO;[j0>1= #rjCbA"IS,V?Xqm{cwkN?KM[ٮC'QDFC3?OΘu<->rjYp;o{tx6:_im)YC?gC3c(t+mۦ jfKhΞ'Cvٷh*7>pnv2.$ccU਄q> 'K@ &}2>( =͔Z/'H5׈t 5wrwR #\Ѐ,j1| ,yt_hmvp%+Xio7Rv)-VŨ#YL<Z"we:U!a;ͳΞzw^j\|DR+VEU=Z8@̩:MX$3 45?_7oc${wi`g qU ;2iX׬[w6F54kg&!Z%P2Erm|37>xi)/hZq_SSJ%"*zF9qMB /(qEgF(ȥhXm|'Serdsq ҮJK$SXN>}}ũCξP2^(.C6N=5^O>SŽK+-oTL$ ћQ3g)>Y2^MMtS]+!hG'y &'KM$>ިcFۯԱ4g$X ?I֦&MD'e~nWp/qμ|^QT&GYUG9M ]񭛺j!J?ִB7ݥ끛nRprV1?ɒݦ:uq6oH=}邍O㪏/TTOd"sIFRV >f' %.ŀ$Kg i&B򝐿oݙd% H)ÀSR$Wd68<#|:gI^2-2y$# r:mluVFCNAJpLJ>խPHH)Z2 Pu4:C87]@5}l.-ܥ4nYIo@FjF,A0?9sVu_V&?\Vyg-bL'']k% n˽sG{HzZ G?/ v[DrF`E\@9՛ꕪVNd⓷ĩj X9%X3ğK޶f4.C{uc}:o=ev}zRsW>˷>ԃWwz@vj՘UŬ eaޖksVHN2ĊD΄b޳D6kIYnd&HV sĬ Yzགྷ\[.NP{Ol'W زa7LP^v7ݐVTH)z.K9_c¹3.Ab'׭#en(ujňEj0ZQ78 3ۍY !NjVg1˞͚3ѭ;@lZ[-,ZRvr9末Hawungw.m&&n#S^^W69BK|骔aEtq/NEތ\a"xr3jI~YjֺjN8_^NfM)"E_\ʜY{MB{ 6.ԭgwȾ{w*?7+,o92K[vƽߨcFjz(m֪]OÓJuMzECYVd,cMn¨[*_+q>{tŸ Ɨ2@z- ׮˼b4p^%sK.VxIkm_l^ ^V5E+V+Xrw OW?9M?BNaUdi]S(.HT>̘ lYb۩8pOӚ0k~+)'vh-ȶ,w^-6z)E-̵+D:Y1Ķz !$EaBoX 独ÏxeJHtRqv?P:D&#5PCV^7+)ϞaK%g9tԚ}:T,w9'HI! lj2FrpY%x'}fKktNҺGWgTtb BjSJ HJeCNeN>3VMB$ݧ.=FB 1!4O~-+*jE))HeJ 3m$<L\"崤<(tA0C}pË+&IHhH&! 
W#$mmU)S)֔d4Ud`6bhθ3NC $\R~ekuΊt ƣ5j:l_ l0: Ύs>K6;qu~k!`C %2ZWTqž׸eKz| ә_I.r^c1Rb +g9E DGm#/&eDxAv0` ^1"'kiG}K& 0<8]rg` c4ւZ` .`;-ra ǤU ,(y :UVeN}`y<pL *L:2ba-dTR-NHrW4d ϐBu_FH(:QX R: S*!:uJ@"BVT  D/^ĸX<NY'|VŢd2C]4<ϊH8 iBUWz-Zw}Gu{@Z/٥ZIE)Ё'BlTzCes Cq2F0-"1}+$\rLhjQWBE(XqXD@0pxKͲZCW5`OaeK8h2jC&:Sd)i% *ǰN%cc;^t> ja&,U0^[:pĥq@8cPY*&9@r!V30[$97:/3 ~Ѷ$RDN+We2O Lkh8@ #"Arѡ^F=ЭHCfhm]Gs"Y ԏD?^_żllj6p)ƪ@'n "Q C .BDљx  =L)=`,C)9h 6#N:G!;ja(ͮ$}m>A1HNIjjuKƂ0e#Kjj7Fu X0a'ى(@AV ?KFt)d9mZs+z 0]+B:hϢ;+a(XJ"4Aґ, B)t[q/Z7(ݭE a +ZR0EKyZ,6mc0mHr%RV0zDp31i02Go9(]THO5p7LJxKdCC*Cϝk-aɁڸ)7Zt17j:!c` ċ^\ -?M89W'Trr,yxS)Z Brk@@M_%$n5\wuШE{N=+jW(us8=tGR=QwP󜏾\ɿ,FoFvϠegi{͋_Z&zX1>摤8A[w6xY{ѫ[Q- h1?']ϯVdi?mxANOr+Phy&dJޏ~#gOߌ2j4~|}Zʤ+A?>)DW}S?!b+ )j)F;>-^Z6X9Oxw)RLƵ<{3.Z A< Mc $V⅗˝?KrW4[¢̆ z jZz;oSa~Ovveh5Oh9lY˜wՠuG]w[<kDUtF ]!.hl&-HֺKI*kLm;E:Ya]c3CNARV`w鶦oxf?e,A6kt枫RzG[<W~^ڼ[]vgWduUQoY}ݔlk=KNyabAҹm|9\G `e I aVs3FR97d/¸ޡodE/6ټ΄ܓA,2xR3]H_xZ\JbmlJY%Yܟ{JMLI;鲕)&x-- MU[jJ]JE]Ki؉cuMC]v"V("+!tovҩtѕ[q~c-Egn</O#2ϷG5Q*}mߺ9Ɠ9NկFz5 mV]6~rb|W<*o~W(@M3~yosڴQ}_Y\]KeHSya[vN21?=>|4"ehѦioY__|?kQ mf|YeE B7ATjWg{n`b.}6z=7lUYWuZ#BpkDcp+Z"J N[ڦz.ߟ)W'24{`h4CB;_w+E8kyg7q{CN#=mO#ǪK~Js_wmAY\ɱjMٿMBx]ء.oVcsb϶^k0ANo. \]4}xPj_UF~>Gğ ']Yжvђ@ $*;:$w(%kkrWgq:}7%bUb:_|:-%)Um|}7\MƛR#˩WbjgJpj O%um[ n @!!>.RNGo  ;_ ;^bfzu}YWOձ%a1E̼󭵢MY~;-FSZ˹GMo!y;Ywܡ=r#V̑L||` ZdȍA iBtF W?Ǭb|pȅ#WBi'/{`xYz)h.g%wIj'0i;#0QUkK;qUgYu ^ErkeÈCr@89fkg _T)U,N:Z;ңvVR*ߍ䑸 fwD4kxG_v-zL=z}=f>;:AE )%S"PE;>~;WtJ,m1VEY|m۾$CQV~= 0z>Zur hC ! |0WuݾX~G!9һJNRF[{-{v󯰏+!e7c7ⷫ _G)Iӗj'^}uU˘pj%4@ŝ..!˂moYp8}S5bhM 874WѦt[ /+$7!Ȧ&.]}wMO1u{ \G5Vn1;bl;'\fyѝ^ިq:ܛ}[oP~_Ihy0Np-~<q%MX_W{՗@w( {"c1hM<,;ܺחY\ܽZ Gu0w\c뤲4ד.] 矠"bn[`Om޶]J穴R|]+j>_E˹PmήKdž#i|?Vu-MǕyR=I _s^kbRkjlSyUlWiS靉t2 Z)V3]3 /f s^VF|684\ڐf JdN >H"=K9IMt mrH,סlƕ[MֱzZ' 83gcݾv}!減S,~^31Kն* ,׃rK)&^_1{1,FC(O:HMI5qdl)I3~)Q,/u^8Dz_EOv!Ag 3au? 
$%>o#i.ϼ1<基4 'N }>1vpS&m蒶(_߻S JpDLS) u~sp 煎Q !Fб|VdW=LWˢW.A*`^aik Q (IJlOު\yܺG;C1QF 1۬OZ{L:M[]ٺҘ=nMij5g™oh_t/g3U"(LP>(+r t6RCjtٻ8%WJ<ق_ꜳG݇ڇkn`jFVwCiM .$YYqmf͹g..-V\$ɴ>r=,k"zySpj)#WU[LۡUsZW fīτW]dqz_F)]Y }]^}{0d/R2(1%Ŀbo˶|)3l7 !:",.Fm4E,"#BuYQKW`b*D!xXSo:g 6IoCB`QI=#)KW̉D!R^Rz Ւl_nȭYsՆRJl$ (GE`q!%5BfP+NvIȓlEB*2Zae#"DΉADQB߯حV2JJ6m>nZ4!yr"{d㥔w>:Z 8Q݈YM{=k&"*_l|H{8S,!Ip/ Qyb[L1Βգz\ $d+7pX] ;YN%~8v+DY$:ϳ@ lxtyxŇy(i1[^*Mzrz<NjSl@$fI eaE$!,[@;Pc6тH*AP[ ]ŧM"PeQlØg&8tzj~PwC # h^͝<9濧U~S}krvD_Y}86dej\ٹI%@B`Cݑto͍.֖Ǖ rZGTl: ծ~dccALe `Iқ,b"C,[dx #zj`\t^J䌷1P6c޹ c*cR!Ab.EZS4tV.8⬍ Wu7>5WKrXx e{x=3 glQX˥E,ZC.1a镈1ʘwW86 b7]_L&'/l|mَH뙐2Hkv6]+[d6a{ 1N5\*$Zf7NxRܕS(:d$z3[kGHU[g$vڀBK&ԒJ&Y N1D$\ S%_{襜nvp(\op몳R,ħY(.3;%(4X/P1 ҡ`cbX 2$mu tM@X_LPQ2ZYV$Mu Hk߀)._Ӭ9w+|"rKT j,x;ƹgEN26ev1Ѫ]٫e-#w}&ӄ_L :uGg-jXcTޤ 9,s>p9ؓQխRwdRIKWj\E F* "z᤯PG,u2R{AMFy:픚ht`,@EAG"Y%5P cj|΁\FQ)x_%Z`3&1k& _g([gƁ Bg`ҡB7o4@UNM :@Wt)i (q`ϣNK;1rq j:;X$b1~Wc ҠUE撘O!.ȸB(" =s:nhF56 ܝ qVaPYsq5 3#@oƓDkcCqc(PH9WM7%eKQW3Cz1U{sWo4Mgǫ+NuRhv?{/κ踓u{Jo?wkw|kzͯojK޼޼_ϔU$lrיk/u݇'/=JǼƫů/;lp>k'}yM~J% ݝީ.q֚(ŏXj+JV.uc;SKVbov@'Z̻Y/2G:1|7/ θi.{N"Ng߱|tw)pao?P\t wCGACiL#|:٩"|ymٓ[B*-{b6T`eo~h&_[f)b&G l fX6w pD1=#(J}M| cacU^߭Zo3y^ZX7vղ:2K~2;M+-$cuCnKfwçAt?{sЧy +i^OK%VۿnU%ep,~?}ՠ \(_?J zgXĺRWowm7]>^;s$NVpF5,e)=Px֊/)mD{V+֫WP `vsD4&˨#]-wƆ֊ *}V1:C:^׾uQmթ[ƙgYѩǫ 2w6ɾ{kN[>Y21#lRXFT"7]jt=hZo>T2*s UH6'a3*_Q̈+hLE$tT>ssFqev(nn2Cp8'^s>~ɓg9vEz!vk!7:=:>~߸?;hXپA8GtZՙޢhqG#8zfpD(!}QhB.BhG*erH+BA.es56dvF5̣Jva/mNwySc}D%uRi0:(֣2>J 3|>\O=2*ʻDo,f{sدtI5kcp7boXx_}N>OX?3kRTΧ&IWה𫒓 Gb8+YvȂٹ3&΂ҞȄN(c1 㜳aD9[]fTb0UǛFO-l` A!!S T EPz3VCe{Xe=/'C+t! -b Y s/!bK RRUT2L#D95LV-d$^iC0ҁE^zUsVT:8&,I4ٕdWQ.2==r<;dJi6ª,|U]|cwܩR|%H_JrަV/ +6R,[A6fs#K;ۺ6DQ2i٢/m]`kREKŦrEPC)&ՔdiM֤Fel֜-c;6mmjl h ^駊7 {.Îwǽο2_kr'Wn%*hJa f!TL?{WFOQ—b nwf p/[ q,CRVK-4#mu]dU=dES`\ZDYQ[;] В1*lj)2e<p*; ]E΋K%L%c-L80!!E> P褋"\bΙhhFy~" hXDNxB,y$bEZcAժln碔*gQguQw5wR=Ӑd-jÙcrTI+:XII4a5˻UGb;!uf%E Ox*?OZ"1> z1">*(59:.>. 
f8<|k-(^Z|Y,ˇ.F>8xxuvXwZV?:}u(/FXT FEOaʥ yPwQܢGk>7(KsS2㒓!v̶q)29)ȫ F7g ]P:e\]xjcޟ~_n)l}X\~~+ίX] Ԥe"f?<dŻ5UƷAς͖{3> 0trUd8d3.0q؅Z{Mf jDvH#n#9nw嵣J?-ӻ]b5wwkvw#He$"#]+<,tza%32!LK2 좎`Rg͗ z<5䥪 ;Dϯ(getS*4-H-ؑτ"O!]4AwoGh6JJ*LXK:x'Ti+*(uѼ߼N9Mj1೻1C#C );; !H|i ou0]W1pjɚ$'EpٗHbic.LZkBZ6n,M;.M~<_,V=/kHKW9䭊q]KmVg-e+@:jˆ6&IwAwa|ZISWOpgZźzQ_tۗ//l<'^]58 \ ͺ|jˆ{'[3/f3y3ϐ%EDWg,~ ki7a+%4*f]q&]TCMtCw_s[wq06:6f塆Sz2 |χZraCk:gaח Iȹ=lSXzɣjGިT`Iׯܑ\*ѻ/Η7(~:uBb1ձ믅VC~e)W;g?AX- 8I;pKlVs%y|ιg` |Y> ^{D}R:Yaq6o|=]Swg,R!lWa`66Gl=y7[lw2ځؒ]{θ.[)b9cp_s}EmvLIYYBTl )˜QL 1Y*\(S[;DC#@ax\^2!K|)`Ur,>los,F,YPu)$z^TNe`6,^R2#ƒǪ#<}+QjO:sϴqϴ{a<6L&p엍^SĂ??^7e5L|.`}LbvQixi@?dDOIsJ%,<-6oһO۲JG,fL 2&D.Wu~p_Z~AʼZHt,\yJyK.D՞;Z߷uSғUX{~zY__Y~_.V ͏UqvIfLpﭺS]/_L Qވ6>՚}fjvHSJO~Ziبh{擖HѬLW1 │Nm5'EdYn עEzkhSm`n+||O_bnʇT-ç=zW=`kNf3EAtU~SbGS$G:xAvu(83|u8u ~UV'096κ=Yޟ4{d)O|ds_aqzW=dOfzМUhy-9~{2gU;Z LY'e6i9Jf'h%_c%C~XQgV:;>>ca(ZHif)>i7'H)RZqengU_ O#x+2sv~НVejr[S)`mz<iGӣ.h:f$h{K!]gmgNٜncN?Qң82;FE#/]2;fC'_\v4:n bܓUo&Lq㎨2fy@c( ɵa4i({)P(%q}6FXP%(rե2NrYEkebSIM\;~?4O_S!_)\C&wKyy{@p1/]h\h'@Q2_R`UlEue쩸-&+Flf h+A2 -u=.GU +H|C(CZ,۔QƆJYVdIՊ!=;[I#݃NSZN ǔ֖J*'\tҗ%s2$*R*u2DQZۡr}!2dKrTm5BJa)j DFG0)S,yS$"D> 0pcElPcU%ccRFG%Q) f'EK3yR,Ak2fiwljm35BҰ&aB%cPFw"Xv.EëVgk>IW•9! +' #AC/˫. g]HR%ӚXZB'.?SkP9(l^%VbC Zc$pNdEމ ^ƠLΚ'ШލKS1sRv ESpś\##+V RCzFYoVpL@5 !JdM`!>Q.jm 碌Rq$ {iPXdGIl cHJTEv,k;}0$<7 Č6Z yPXudU!(DGkE"6DeE@BR`/#j{R),و"@Pv ,0:Y-ֲׂ:pUp;eq8 mG5&Jb,Pa h yԲD16| 9!ae W%U33.e]\ APj "(khsQ , ;(ؒ+ TJC¢]wvLa+ YId-t*X$ V(FHydsBP 0;Iҏ), xDE1;"dUG0MRdΠ|!2M΢ɠR;He6(@2b3.WR̈i <o#VH(@Lh-W\2IJ$KUE1* >+ڢQ͠`xp\-&Q!o ̐6 KA4`FȤ} "%+%ޔ]BF0J8A1nPXd50K, Rl ilRDBj6]vZh FGe] % ˒2FD0 ϋE/S* H 2 RDQDY"H[0rZxNZ W])1GM&23[04otQKo3f<kcD%ŊԨIJ$B&}(&f04]|\,|>yUN( j=#jq^#úec4BĿz1_a+@1\ƒr\ \1DN!FuԶu@2apaBl3B4Z1<*IUkzY\:h\i aQЏ'#80`ci FjmRcZ |,$͙˃>0t$ƀ?!],nY##Ap^!SCxYk$}(ѪdL2 G tiY jCP/jC ~4S"qFd>8=ZMfPO鐣,2 8'@󋆗 .Z6l!&\Ko&".ńE@MpIF%. 
h&J#:KckT];g˥^m!:[gk+;]XHkc6=RQȿ#\!:{yVV[kۉliC_Ԫ̃[XUTUx)qJJ o gt@vc/cUʍ-I tJ ək,)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@GTUx[&zZ C dI H wR[)잪@/FvsrJhR@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ7RksJJ ;kOG dd@VWJ'I @#%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R*PI dP(`'VV::M*t %)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R(XZO.ⷳwLR~ xTJ/ݟfFlmŤ'Y/9i?zﺚ5&NgQ}֯< g ]v?<_yCM;qc.?x!OR%kij{v6 lѥ+-R>Ծw6;؁Ρ4!%[rg@>6'}A֣6Uʿ2Bt/0O}A &m\]$[AsބL{|6w LbSƕb ̏ϳZ~r)WO+]\Go]?o'qȏ׳XKէ 2Тퟯo~#'mCn?-ۜqh+?Ϻ }m_n@d,dpo? Yck)!'lj#N':2 lwn1o{`s_~Y ɰݽ Xtw7pGE*WʋZgN(w,o7x*u~Mn/?ooeps~N_ρwO0/ aQ[yMVs>{ m =TkRՀ2/ï?މx O-lG|F:Y~k<Ѓ n|G/WYxftgn5ܿ@)tzr'#6e*,76wE|RF63_gƙδG3Q*Yd>]RID `p>[ʸP"/LhhAQq|#yg.1KZ.̆S,J}_ UUb0wubr^tR-FcHm-v_nbX\ΊanSbwQ, AI7a_?f}/w Ӗa1V^3ŦPI;^IqyŚM&5dԒZK'⨠sE#ytn˩G@e;~Ӹqr?,˷? 7u۸` Xld UjJI-ÓpJRR 5:jؒ%+#r|ӰD9b:?"ݯC[b}?BhYLnx~~-ZA5{5tt597:B5̭l!Cǔb%u>z"VQq\2`z0]~zL_mna[]g?]z+>jEqߖz4r]J!@=|[U<(5fβ1r1Ub1Ej`J\wo۫pWjmmH\z݇-wV%x][Ҍgcq+QgsT\K$x(1۶M䡘PF8*[_&۾J  1Cke!O.'{2HNGP"!䕓:Gq!ZAt39&ēߎ|ck<"3ج~4$nr!R4U}nûTF~㸦ӟ.[C.eZFچ!%ƒ-vna[ [c!smgÜjQ (m|c \d#jN,NU^psո4/6/ ICx)RFK1^xXfWKt!{E_罱Ay+)L>ċᖮ[֣@->Y-H&Zlv!/6YuJhtK:/n|%vn&cl"bkڀ]I[BK)%Hu}ۑ"<܇a)ޭ>Zl6z)1D&eFE#uNGjRwTעau]v*1E=j۲Vh#H: c.8Nbޕ$u-*:9o l׻{׏(/m$P )f20wG$#{1d}e2v(!N0CI(e ޽*xwߑ:y7{x$]ހ On[BZBZB Zvqud+m#I0X@-) ?ݳ,z؇a)H ^!a$v+Э Zx ̂V"Y1 1CLp6bES9"{P$"f::O%Waɏ|EUߜU5~i`peRXݜUz?{4US?_.|j OUIegՇҸj/GxT~Zу~~;*UG'U^5%lr<UvV]XONSGFx,Ud4]G՜]M[):OP̞Bg(uՔf8R+KeH*NԾI+F$M_C?fpJ#83ފ3aʐM,ĒW]g]1^}%vf@D~1"[ N ǓT2 ˹'KkN{>-lP ޶À,quk ?sJ{U/|ij.{7DE\,XBe3UFkHSue0UFl뇴A[^d&VAV2s&wLG(/fXm{~4_W4nkjX%jP"| ٸ96WeVXN`lsS[ 谤ۮr_B9Ay&\!,G .X&u9lV1 ;sv7okϿ(>}~gvE-ZeR.+ȵLEObxjHµ[>Y[uf"q(ӝ#n r),TldʀB+/|fDBWDCGu&vrD8_7AFOr uy Alc|=jig5OKsF-Qq' Dԁ:YIxQ$8Τ%@؇GzxqhO3yZWlY{,WoR>WO3bu g>~"nEJlA4<%5wlt^5_r%\ IK}>dSI YliL h"ZbJ1*YNjIH-6DU`rG eHч@ "pS.s>/K&;gP}H~{Zߗ$ %sGj.eJd ΤlKT8*D5Y@vU>+^>%"r@FbTZ:a6vsՕ8 O'Ra0mw'{g%_O[]d4^려@׷i{>vX9ak[z*Y8o} 'wCq \9)gEPHvì02;mkvJ&w)X9 L1eAF"[sA$ $Zۤg3q#cw\3,L3B1  
g^ffsv]v6qUmNpLqĎfo1'șlӢ6!b̄YЀHv{h`2!EMM4$#v̺,!8ITFO Z]xV\CAδc_ԖQg̀B:bemrt\08ALT\+aLmEV,!r /:J45 Y#:̓! B}xؙ8u;ӏ}:FDGMo9b(r2d^RY ˖T ƬAxkdAz"6Y񏒞@bzdK\p&-F͍D#yC!vߙ8;D!@s:δd_\qōQ~`ȸ! /I.s:*AdZ5B CδPPa't_m3ŋvܺUllʎsv_)QU4q|=WJL,Bu\2Z:De9 ԠL,CB!ځ{} $1 H=$OD@"\rFp2R;R'ߢq>@ۃT-..FN.OOp`Ǔ I|3?JY wG<  ~s#Vx]†m+Lb|aCpS~~-Ĵ9k`7I3tiv{ųOb`_4*j]h?1*|zwRR~j˒y'](dp=G=ygi^dgf歶eg&f mfjbgs=np7llve/Btѯ컆Z ( 4Dakx1}S UkM\~nԛo/v 93e@ĝwǀވhu4- %Ĭ IXѳȽN@h`̧uӫ˃9G+8sd:U_q ywI}L;Ƥ>gzaړ$\9%hkvC Yi`2˵[3|rQ|[ tTwfө5{ӬspY 21D 'NڠbHG)s WE8 Ln u{ !JeY`h@"oR* r :$xNv&Ύ)~K4=|+{E" 47I5۫EםR8k^˽M:.wY3+^Jƕ0.4q˼OelnE> BU1&nKbnŲ@ԃiMt b\Dofލ3ͼ̽fnų;ynFX1-x=d>1qD_.L?rF/  0ϱ,XJʄ@R(~vb~?}lDK`D[:?wlk qd^!xe`VŒ?lHh5c "lOh1F}sh+#ԁ> FX/7#O w6.RMDEhqXm}\`鸞 lf>pPų t`"Z]9Rb̚\ۭߔn4fogpE&0'g r9vTGYf>yƪ05tsbc:S.@ޛj{Y!1 f1E_po}V!<&\aN)?+&z{M?q$"$֌ e0SDHIR6H@S%M@AI-f)Wz~c25B!dF;d&4M2'fėKˌ)k|;$F'W>؎q6\5=7Hݘc)=yhƇ_NpG.M16 I0uJD늣fEh)Qr:ی":(fi3Dh #L:jK$?DIIuD @<# )_;& $&I4f  )(f)6Yb45'B&DXG<8'pͺ%L]Y8EN%53gS0W]XkM8׆1R[ g~̛7E^`< Y.fӋ.(M5}7rV{׋njRw@wE8K7a h.CpnnpV!;1|7(s"fFWuoU|Fj 7|۪y/Kz-*N}k -䡋'ƛ5O .g3o=u| 3%]ͭ'׶OΌ.σ@_OLtD3;(aȵÊ5/V؟NR5$Qw>H}l_YkWg9>f9)rSr]Y,UEReyh:ߎEQڏu")b#eg.I|`gy$2Lpo&(d"sc̲x> x]o%^ׁJةȯ"ny~z^ΗuDnҠz B_J].˫wѲ1wT\.F~3_LGMh"~~W~r#zG.tycyׯz4L Ŕ$fڤY-k˜/bJ4/t<^z$deWgsLK 轨y;l"_ >o@{ r7%^-BԺ^)?z_]V[8:_ `_^^?Y1K5p, ǔ\y7$?R1à>zC}(z>{R4>LEo~ܗQ8~xiEȹY={v򬫴f(8#Aj,&C3w#v>ʳհ/|./כu~p\5#P7-'DpqM_-G}i,Lez?*Fyh,Do8i0 ^O~"Wz(=J!3oi%܄V .A_}$$a `v]9W ='QX#JS)9Bw]8UyE_2l^є FIW9W>cJI pVHp|VVIG`<_tzE[6)PHLv%T!AE|$dC)BnϩC# %|| m`6ϙP|\GՐ}z'⺶Y/֜zeK! Y8TASQ2ʛPxEМf:nI(R]q^wsZ${DV"{x4O5[9L!$û5M{bw^y8O1ĈDgR&ͫaW_-o`?<Qx!͉L]`2z!wI0yLSrVz>hv2pJSPq#Jz&KCjl6_Fw4W5'AXk|_qE=Flb߃J6;"1չ7w@)WbGz%w)|`u7qA$kMvnMD!X[k\wytо:TuA? 
ͭEY:ʶ1hCERPASʰ(AF% Y*84K`(!D#J10Iaw ã*Y~, v"LJ.N08!9~eǰ\rzU^ID^x /[_X/RI8ۋuH'+*2Rd(FID7]FEYκZ 4k &p[8W^ :S&P(+{bK‚Qh9O(;pA/nd u{ȲW5h(U)\;~5=rzWGA =/z737BI n@ZJ)t ΪSD2|{Q/ |ᯖ`'΄l1)>ll.勼Z#Lzu~Ep9WzQ.\zf2uvcI8b.6>bnIY_ws8[1#v^n.:  u4p$_h}/Ll:Ox?6ҳ~lMNǁ_P:{iPZ;3!'Xz/:Ʃ9]|ܲA a$Kȴ 'v|?IsR׿9;9{<Kp͛XDy'lX4FE ~Yoᴻ,2-'B7E9  yB\KWp ((ޘIBe D#FkV#Gs-LQ<'D!R%ّ)M56jOzEw)+0=.R`KzZQU0&z ?r_FM(1;."g]׿:X Y{|<.B`ܻiTą3q8 jk LSToEPCPW"MDXv.6ه`b휜^R^#\4 6T|αN N)tb:u;Sq uJH$Q94$Tj'w5du(eCx Jco`1?_% EN8YLd>rGuwԭQUL \!G,bI:f2EVq"gv'(kNeaS(b$~ؼQQO5 F!o>EG1 e +4d ң_c?]a*wGƺ8F!YI1E9 "D&PF94o[ZM>ޥ=gXZ ?g#{gY"JT>(.H]yRD§X>B/,5mJ2;յ{3]p2֤CL Z|KȮ_kR#kQVjQuPWI)C&T |;6mrA ϐt4Pg$%]MPktBLq (uW:k-*SJ:ˤ:+O6鬏apús4[g%UF묭CZ/ٚ!v؃N̫Cd:/G{ ZwuiTPpԅ=X>Zy dZҁ_+ZD hwb%}L%%K?.S@UiK ~ %!YҚAc;254#ÉK:q_j-N wxngmGW<]/<`ٗDJ"9Iw$mɦLU,ֽXe\MOԔ9VeX 쳌,ZNJ`F]ak[hees5Xlj9!z8 1Qh^DI E^$NCD8vm~?6tb (3uڱHV`H9պ+ˑjJr$B-HrB}j')q+sclj?˿~7م?6&FZ2DNNK -{*T)'4!年sU 9\N,k)O<Ȫk1ދvџ*1NÐ= 'h:YbOmV#ԭDLKuC?#OpOׇ_{E"gW;7!dٷ€Kќ_3:Up/4%IzF~g^lr5Ky` p}P&̪ܐoXltǶd Z'=ToAC_jd<-T@ltI /Б3,݄Ʈ` TA7 /)b`CXEN'$U%ZWULr=+rW{FokV8<@J4z5_~ݗa6}HM gXwJ4XAUb7BBHkQ]~zfJ]@a:1YՊɬ7ۏsʁ^T@>Nnwx@bCYkjA< `5;J=ŃH` )(] 2\`Ohp+65sR>RUus%AlpbFj;?Synն,R ƍ6iDg1rxNhkƊґm>G̜])[b.DqٍĖ]1^, fCNkE{ VćfOڔZ<~p/DjYoO&+{ 6D3db;;uf cƤaJ*^'2ƃK!U-cj)^EY3?H)2Q0N2/צTCTpķ;]YʍQTTw>0VA#,x` !d`k^YbJ+0Y./w"K080CAʫ#p jQ X<. T!QAO+Q(3FK3R-SL.&*#?qpIՇA<0?x0YO7 j=Se` OnbtD1'b]NBNDoh3HD e,CUT_]?PXXc;__:$2=MП)o~grd$w!Qg͛.MTE"T"$KDfHn5lZl婬j-3TA@!$GeY}B*-C] 2Gչc/Pq;P;NQ<Ϛ3hztZPf? rKOLxgk<`\=mPJJC.LbJKa:J`slփԏ 8DJm=U*vjHUGu ޺Z6l9b@gdIվjhE`v<s5i \{e 8"uE0;:LSc9b;k\(i'W2yG, ~˛=S(Dd,&˓qw{/%StDk*#V{$ ! FkGCYQ #DV?*4p=IɎKzZ3QLHJ$)R@]w3Ȑ(`p`spXpmPͣ)^ 0麷9@g? 
$ќ\h\m6w<2Zii1HW=JpՀBתVel2MYb-׷-OΊQwXTȚOmqZ}\_S[2~k5hvy{xss5byկD& ܀>[7ZjeE놈 Uƣ[Hn|۱zGy1ofnd P̈P# H<, c72j ڭ;^ALIݍd~cAF@%" 1($%Z%Q{^|.K&_~Dh̀n`@*"9Bj%@Y &B K5ȵsfjX)8G-EoB' Bg9Ċ6\Ck&%& ,ΉڊW]5l{N=8UFNUhf5#4VL`EK^ eL3dKZ L,Q']n"2&ӽ r;Nʘ@c4)2df(6=ÍqYđ͂NU"ǣײu7+`,ܴ\㽗ʨ1S֯gm.udאsk$(_ >]A8sㄓdj4sd1.{'ֹDD8 .S4V[cfgERf-,Se2 !29wmPeV*)&#rXIՃ{,*ך+9`+Rjy+ˬ^e}h:xuheɂ *bŴ'WUi8\  #=~|%(.; ;ω}vyEtX)PO.ZE6&SunL)2[͞bݝW:^ZCa&`cw8e9,ǜc +QRJMl _jdHsHh u ˲RC*ƴ'k c;/#QYQ L |QBbZ#`a]6J3\ SK91о9b'\@ a&!$UBk@ZYƷRNSH+m"%N[9%;%ͲEmY(blr 5PIhĚ ;ٴf6`h%&wVdxyyBӔsj>+ )TXjkηkܭ_TVT2;#$w?0F1=Π!۽e1zEzۉ(+=%TQ*ID,i;;%ELNd}dudIB1Lz\?Hgz"J4N`mٱe2*Y 7hvg#h~!XJI[9NW$U|싒ը#VPuNX'7 b 8Ij,YT!3d6}=!P36Z-ߎU*ƙeLQ\LddYvMm.vwu?#6cdB5λJcu#\#1/PLg&xiٝj_UKg?zu;qXZIb#SaU _2+$G^ұE*JQDs }u`bI_sp3?%xLw'\ю5.:QYҢ]2pIRiHN5xvtk/ӾRPy\ |]Yp1nX25& BǷ[_YoO=KNɧzv9{Zo/섪JE{'m:aX^&Y?-"?Qv˲! ?NVZJy_ŋ嶃*tEtә@!;uBҘI(FӢebk/lN/9Cڪ BGU}z\<5.`Tt}JgKS^?dBK貣%D-T=ʎG9ɩ>R+?$ˁC4/pBWZyuA’Rp/J J?OԻd~,7poV-g y u}d<y1bbrU2Kߧg՗_SϕOL:[,'ehob;*#rm=![XDg77iSS L^@̞#9 V>7˿@b&zzњmDna S#^@p$b(sGyq/,7Rs*<>G/iOf_CG݅ 5taA|P0@lh X-`f!+t~ {%,b4AԞ9{;u$}:-řQ[3p8fU4OCue12V^=UY7J359/~F=wChw$֦uV:Y}[X_IRdu+m%h`0H]s9h t0/LNfVYeIVR]TH䒀ɵ!yBoW徺o #Tm]bjSLi Xi4DIa=_,1:kIOsU͝N@^;8@Gޢ<6(D1Gt]$ჶVpل@Ѓf,4@$o๜kE@Gv}c1{3좵B<.8*c go.'J!ܟkݢɉh fQlPa+V%cԷvt˽T7jq`Oy B\`9)9:+Oⓗj5hհliңrª[JE4A2E; 6LCF1l8Vm)i9Nm| ?>BÜ]` w^e,q jhS {b{k`?5ꁛ|:gz.Sޥ tQ^7#\[ +=Y[5lr>ff*(MTA(!)(\Hh1Hzj\*p)祆iHPAZ2pF$6D5-"ݏtVr_@jDhh|tءp Ҟ ;а0m*Bוěvs:[UAܠC&PE1Y%ɁUI%,vGgIXĬOaeM6ҪHUx4E"$ue.E 4m4" Q,g7:8.,VˣVd*~3_1Dei9n+Ϻpi…C؎,S͆5.zฯ􍴩1ΏY-fǩqo⟳I۲yvZ\g$lv(w!\6׮#Şsr %̍+0_O)@]ǑX>CL+Q&<&uMµ(aM a$Wl\Z膶lv?w[.FfK:dۄ@&qK>a2&Z;.ml0-!R+.k.0Z嵧eG/HWG6? 
EB*ˁ^os+ h`0I.5&1faZ=/H-F~[ 3k;.@ڒS)3mK)ri}YJ(|G4F{Jp'WXd"6B?` E_ə;2DJGąucBԖ rᐳ.ݦ*ru$01ٯ!Q9;K}qDä.Sb\ͱŇL39TmQ!K!TSJ(uOI&Yo$\p/)ڭOhn>O@+,* M?ea3{ O3+?kވQ^b >BSB1ecO~N+ef'(Kpp;]q~TqӏUHaVT\b S cqD*j'E;L;,0GR0Z&yp$=ԍet3>;0!+D*PqqNj$RdD!ֆz ujHiGH/c,N(@FgaQ ﬏ZU-\!F[Bŀpo0d!" )L b8jA&!0r4SE#c@CTS,0j1v2J7?+n(X_&1͝zm_Mn]|75gh6AMty9sִFThz4y= t b`\p,9 G&A&ׄDj H] v6yXPGMxIׅ& O Cl'0$W+wEh,z3[]Lf&J2"0F|(9T z䁟4Q\Bt}|ͻ5k6kiP75q ESaD\ Z'P-=Ze=q {m=DTdP',)"ӸC_)R.ףK *3G&EpP㖍p(0y Ai=#$x.IAPNT Ez)^~C.CE¦ +Fq* R")wZ0(NDc=pp('D /Ep8?{Wȍ/=wҀ?fr"IǺؖ#ɓL[lv[j٤ZjɎd2[쪧d{='J.WcyFo(8E**Fcf;J)JCetShAiQarUth"VʊDKHb̓c&t؆UFZ""qT8ct .;0ͧc걆K\ orQjSA?2op"Z*!!.Q\ҽrtqȀփ&{G$ho% G֊Z[1ͥYk_Q0VJFc1 N(h+8L΂! 61@$WExl|[S@,q儮Pth+Gyj$xUEtѸ<%&h _A(Wk;*q4)5 MW4VP4`$}#y?,J_cJA-$ڕD MhzǨ t QJE!"z7x{VI\oo>_]oSY7#w?Y~΁/EqCW ]$'ZE~xjFֳzwtO+> *12+ޝ t6_*K$;th FN@Ov6xvp(*ɕ1thROơ[m%yʔXN 8='`c.m. 6DA Ӧ OApCd *HFB˥*Gi -PAFj*((ՔM0-P5Xж @; grRAh zW=$]kfXi*U%GQ0:eR`A4(yvLs-# Dc2&ZUQ-(Jbkc(IzB1ݷ\`RG CP+Uy*L2뚧q3l'j թ@r[bۦy#]4O z9'Kve?F.RȺHXC%bh%Fjtn˴t룤TZtL˴.;ld8g4J(`վ O)ig@ot}$F(On բtYJHW-T ڶ1ŕ*T:|J: 6*NCf "C"QPۖu-1 !"8FD6ہ N~]~DqPZR oC46J.=po}eUEKVڤmNp̡ubn&΄DhWѷqBՆFߜt]lR `6&Go @X .vTh%PoTZɳDʼn uCi EZY3.Eс՜"K5ږ1 c25PRsp Q0ǻLNҲ''!MNΒ\/q~si8q qt6n:Y=LRBl %u[ A);o{bPq=w.?^̿ xz꜏<8dh3jh A4ƸwG783v=Q#G/Ԣ ̈́6+|f=+z)ܗېd j;^4 e<޺ϾWe[u[Ƿ9@·"{=s{[db(Ig:Lz5n!oUkװn}:D ZDyZ< =xg U:j|LlЪ 38ƿ&!Cᙽ<=Ega~?n&;N*"1\WQ@+vZrx]Ir@R8KkYf_,FGfOg#FJd;YAPT G` J&?1K P͍hPTW2SZ Q V8tE!xJȄPI/s.0r=<3m/T IPNj3V"wlgN^QF^q|o'4p}vq|A%5 >[)۰q+߂я5tLӑ@ùIZOjShgeFd6D}o&AYgOwP;vji@;YXI5to$LZ YQO&]$ q$?iA |9 p~a&nqѬH:h%E^BRࡹA=(co3{}Ԛ@F`LepShB3)`\BD9Z( "%"Lk#ˍҺeqKʝkWάb$ bBb˴,6Cv{3Lo.꿧 |pL T|^էIpY;3/SV4nc~5\~X%@~n kُhӤ?hL'RTe5o'(K*^='yWlJĺXpS?XV]B(Yh )wi'4`VZ$~2֔ E*B>9BEàhdkAc_Gcvk8_7F5tOo:M8 ֓{j$r~(5G7n%?V% x%⢔v#Y|Hl jymɕo[Jl$5p,+W'Cj+%:E.#9F'=SC;T!)QS*[Dp(W*W!H+UDk$ c2S{Vwg4 CM^$k'5"Hu )*\itr$1csCdp0xȼj9%Ual)3|D%JyQS^i prA\HQ W%#\Lδ/ ڙΌoW 05;W}Ua鮣1M L&!*X@UKfD;K%4e%OPD$S/ 膻7hGڻI.;$Nť t (ܖ4B8 3_K:'M-QP *(UdI~Y .si2M/=ϦC|tAmË_zQ҄go<LDehNK 
14w'7i%Xt+4B:\VpӓGD0L"/C̺+܆p\ۛh#Id>vNַWb>IsyV|im"5{) L?򣙬 ,qC-*(*H֣wE3Cy]kPsKcLO7m'#I xV2:/1h3}͆dLrmd3 ~ydVD7RIBxv(&VJI j&׊=L눮n#qIR"f[\LՏtP NYrL'tO+g89{/PjZ{fgxSY H2]\]D&"l^63Fa9gӕ ךlbvCVyK㲕Fù Qd!JD%NFu2-3 ^ڐgU2/ن|<Bdmw%5~[ `ܟ,jO{KsHZL_ك>[1;s=sRJ빿ʸXٹH^gB06>='_Vْ`I E %;nAb\ 3 y~h*MUT;XtZu S:y=?oQMUks^dI; b=E`ԞhC!Y_I*(p5QGWƵF+H/#E @"oZiKp&[|"[1דchsL=ٷ=yg2Hj=5/O7~Ύ/]d(^"Q#'Ž4/bD5.̩׸Dt52 -L!5OnoMS}X%Oƥ;_X¿PTvlׁdxlU}TOR)Q ="N"i4Q$۳"IfS,i9Ov„兝/81p艮{>Ƽ؞j F, MTIu! {iBF"߉\賷Wd޵>7n#/Ԇ2W8LM\$#ˊ$gI@!jK~FJsm"YNtz8wQ5vTm)lwv1*Z[.u}԰!FJ<[>UB-^qz?:׳T#wBѐT,iI&CeIԀ^CmOGݎHEfп/BCK^ e]ܯ}a3"𙦴Ih7`|M' xa(VHlш*~ #Jk&Q;KS3ܗ_ Ң! 9C-պN4p-<5%>Gg{f4K^7 Uc~Ԅlȝ)ϾEn#z6 C_WQ~_Fөv6swdYvs, ) z`[Thر`$Z ޝqp ][o 7v|N"r6[>r|YDܑzo,#3yM 0 f/y|" Iq:D[cm}NFԫ>ZV0Ӎ҇ @g}+T{ЅE)׍+A9mH,/Ǜh;s0N&-v.x3+.9 zJ9*HW tf# YL2 d;~|~gO5XA& mRvv7Z8Quw"p~IBΆ+F.&l? a.CRٰ3KOU"խ!^P1@V."ٔ+x!QS%ZS*wIYqBYQ2y fůhΆ! {\Xrm+C95PQkQAy>T|lUɺa!Ҝ^K5y*=' BAm*vcܖ%uV͚He7Wu%+[mEwv*^c3,Q9iV&]Vb՜)l6fI-W"jRE<_^Ar>Uc3%;ɼi5u/~XkMwc2m}BrVXSIWfЊ?;,YȖLU/ul\MrW'~!&Ьz% (yXF~=Һ.˽zK38aO`v;yKe1ּ^G$Ż։<ލ_bP|QB7Jw~S *^w/5 -R]{IVύ9hUa !F!мT0Ɂ+kXyQ4p[K;P^icp)9}m'dp"R=h\YaRpfr޹c;W_iC;)' N;ޅp<"wA{Q[CnFaO [}J-zVdYUr=TnS8l !dYJZ|..C=ܩ:?YzuPDuoU,Wد-\vB%U UqXӜVy,Y:Q\۪yl2,k.F. S풚Z#1J#BŸD''^tin!Ӥg]-QAn.UGqy^pOK!tO^@m4=_%ͱO/&h*{DՓNC zmL/=n6Lu%67wO]HmnwωPTTv5R=JRDw\P;TU/!mFhKB! 
w]ΜN"{+:'=s~GDcZwuuʊ𫶖Z鎗S}#N]MjaWЖ uT_dl;Yowg\r0=sW(d rE&(hʃJpo\]' 0Ĭv=<$ꬉƯسXIұ47SRp~׬{ً8"HRԍD.k:$L\Qk\=d}r>|nk!\\zS O.&/ |>ZD (citcIP$DU,:Vل)˭p z6p\Go7mFWzϦ; ]=dqW.5=]GNZ/Vޗk9x=ւTHb)tN 5iRAH GHsPVDRQ:is 7Rz޸yY"Ib-Wbz:XW ː_!%jkr~@3%w3}gM%ɶ HGbELJv7+SMʤ!e+&4hq4olO Exȱ@R~r;P4i2 GA{_c*@CP!)C~PRLsVsvB/gFMG]dna֬Id'xYOOO؋<=t\1Şdqx_3?,Ov~b1D/a檋[7%O\O`lDN>A,Rx}xUdsbܸ;o8=EքaL8@g,TĔP@0![&Ky58򪳟a LAM ~p 1s5]G>Rk53JP C9(év!+9%MW 0#YSnxXlSݿEPI.7#%"W9T^ePa_oaٔ0Ajy-u+{mҋ˭}rt~́bPL<[hk8>_Ǘ[ȘxL2"c͞d4iTrLk>-%0>G&NI}xpPsjKN1 PJl LFuA?-?mAF>RM@۔$N#V3 8c&FǑ"6,$ J4 Ih>WZ׷XݽG/@ߏvBW`9zl1@EU L d1[S%eQP!N0"ps#ROltйkRl5~<  SaWQE4eTr0] AŸ톸pXbZC>ӔK8Պc)]hP\p KaM$Upl(u8T3 (.$E= T1R\jiGw]L·Ɖ GR80p$GrmHfB$qS!IbGm0SC]6C`md[KɎڛgfY4x>G0 8'{;)x_ϓ;.ru8Vlh-bl#85DXDI$:& U"F2S$T8@D%`+OjBcwwoSm,A3x'e.eaKHh$ F0(0݀4"["38E:60늅$[e!C; ԲG cD@$e1F +!b*ΰ EF(a'7#l16ϣwwFr97xj\x+#%ehPPx4<ֲ`"g׫S?cJ;z(tާM7=rZ$%~?_ҽONnԔ]Qv:}:_[۟pkt]zFZs\2oIp$ N>cWSJy b7>;[iK_oN6^ͽyn9ԔRjJBeo3ץ 3ip!>{fL:SLJF +Ao *Ca@~Ԫ˃AsosUSZuVv, ~vhUMxΖE2"xeʞx! SL4_?:, %KI%c*06N6,x:|\}G=[!SG$D|#=? Jꪃ=ǡλ`~?.9S_;`]'իhڟ_.tuLJn$<͟a0Qc[EƉ'7y1]5`fbQ'f٘Tp@ #Im&,WlשdY#T?h~hDk6ώ$7-U}e%+,Bm -w[(gK4!/9-[0*ջ_'J&H<&(h G\ KKE(M c,I r>MP(%)qL?Lo"dboܒ;F5L9gAVO.2旟?"C8QXYM߯wV!f_:sM"BGbCv1ܱ8 O5J:m~]cL)2|bq{ 9Ԃb@;[[a2,ʏn՞R|otFNFey, +C3`Oj4)xu7}b$xL+yp͒ c">Pc'Wj]6# eŽ1;;Ki q'ٳ:uT6 *~ $>U>ni%j̠[n ]DVӔ(>vuqe7A M0܂ߜVcG1n8st%PASILAʢZT $JcXhoϚ4ZyC-#`:;bnyQ/coUzu/t6Hk|˞&PzК0yy1| àj4uҡ9=RE9;; Z3mpr6R~A'lc B6 '!/E1h'v+ma`yKϠyaz8)>PMrD~,g*8DWgt"ʁ"y ɍ1Z$mсzm`rģF{&#',8=>f4uljWy-[}3$vu; -6wKXƱӤi@j+!=[68fhF | Zr;vrHdކ3i\ޞ Ye8fA 6mvO뗭UuB}'~dd۳߿9E-9@#9Ӡh8Eg49piᢈ mU~jf ('Jn@DDMz/eL9(AD% q`cކP.1=T.YhC5k"UxNuI5ڱwswڳV2[l/V~w$₡3>%o|EA M QHq)8?:JZ|wZ׆:ݝֵ5(; Nۥǫ\ɔ_29lj ό]ቈR/MO %&sOMFMIm$ @, @-keQzwͳkVA" YVQ|ByPՄ:. 
Z;TGAutEˉkm)$<:dA$ <8u?'ZE)AURB  CAx4X$mwdϒ %xcJX&m|kHe t(8M2)+$ -Db!$Oi*ĝV,XT6_X{DDVg|]N%;Sx!r!4Eq!c?"bH{ M Ɔ %sk |+3a O#] $EEOGeJ`@z8l1T~33ΡpY6^1"V'h)οIhD%P키A=gH `+HbT8G Y٪aἯBǰ|c3S104 {& V:&LrF@F%`N ȐxR(ivkՉ ;"PQg-ihYHn1pz G q4#\ݾ67 |ݛ|YսdP%@'$O2>w8~RhI&6fQ3ZE˗.m~ !SPQAE‚n:hIV:51 Us6#9ZJ(PZK֠t FLg\MSW=N.J:V]D]Eb[@lp &N A%itt@R+dRj4iBp1XknV.>Rl`( QNpv U ceH7hC' vFV5.#b8;%5c6 g:AGeGfB= Fd8eE2 8/hb)bb6"QZ<3(d4O0#48G# ĒEZ׋ڧ@N) >odV&`-(1(msQXJ䂥Zq%/Zyݾ [}K3{a?*lw6@ կUei'-ʇ:$_ұ|E[Ep!Yן+Q -Ww_EZl: d5DGu(ݸTeFU*ZWA VS XflՖ6wYK7.i1xh 6Gq$CGrf/nLk-}y`40 ٛo$'?eˬY?/YO4!Mbڱ@0m}%I&!69K<=&+A96)ѸkkA N"Jj zuf,vEqua8҄V͵_8e *V#P24^_BD2FDhG=+(+xs@D MҖS||? Z8NUxu=BaB 3f0,X8\Y;f*LeV{׬߶h}}v"y->%aʹ>!^hx=_&v3 m72-(۵X`u}+!.W`ޝԒO<61U->:Wpr Œӻ DVSG^|ː}|9]/ëKt]ISQ1#hyIg %wmxe8BK#pWd*Dcj7OB_ ]d^UTG9:*i4_]<5}1(ySkE W6(&L0ЃGY~_Sg-5\r9+SJE*{t$8dKO+# $&RBDHbRf\W#4^yz۳O36s]psc#kNd޻1OQj/.l <~xn2g <'XI$Y!s-Fb>P4QY.kiWofik>Ӡ2!Ou~Zې, 2&;tu[AJw_"x ;49rV<"vyR\hA1v V)roj e-Kt(UBZ%H=" mN 5~߻ڈn>iP63䔊mIz zw 2 o(I 撔m5Jw)D\[#}O"jmCp5fm=QO>m{=mۣ[{/޽ђUG&lӞl% l Dr`Ug+kNZ!O6 7Ⱝm>9:"ECCQxXc([\.Hn{$$r"DR*tܜ\ssOf=`4yܜaiQW.8(+xtg\6(,*6$ uԌQ9.m•ǝCdruK+'E.2"D,yF48k&wH@4G05%o95frU[Z|UUnS阯ZƽZd䫖w8>^C"RKD*TRp0&pHS*FAS)^./xY; &19zSBjjTmmQ6.1sZѐ><\Źft7)ꁂ9n!ֵP:\@/4_/ƧASEz2{(Uy>kvA,[N')r+eF$R:k^wtNr"ܑ*Ө Wɡ3KD@E6D;Y˕)iˀ#]}}Rt\6NoR/GyyT/_y"vX1#C; C\r_Ou<%xFeݙ- x)@-;1k{=;טዔ*[BA7X^)jl̥%` |,:)N}*&+H # %7MH `y-4Xhk8N֓o-!3&:c6&Xa'\mqde)e)gI:F{ 2&c Jѽa[B`F:clLJ5dprRGbŇB]A g*,=gJ쥆[O.쟎=^痎>׆$͌6㝶\>=:btڲw6첩JZ:É>>j} {M :-)\f9_5O^nν9߼p ӉwN{nܸyxo)A6QnF[a>\[‚QuS5s|U쎻3m:?7H`\Pl\k uhynబTlbq 3/ogGC7@ZvT>Vr)+%CV4]ϟ{$%@(Uҹ-kRL =xzk_--TnqEp+`lǼ-*كdmu'}u e95tt8~'^ " N&76IhOZ4ֳyȎIu8- lU<'J6CwƟU\37UgfGU ؽʂ<w<*AX*Yv{Wxlѐ=%8yTJ>PW[Qd-63ـ~%}`mj*5JŨ2<B`e6g[9̭ :Ԁ)h\VSt+r䏶aQknΧ}Z%>$Tf|wINƗ6>82YIC麳Ί;+#ۋE {U)lXHP-Ti-&†G?Su^']ctTߍ!Sjg#MovoT15ۮ/KVjǜXWaف>hK(vdd{ɉ5 $xtGߜU0kX.}[}xIpQah˜Ayp}޿}~ !#GH΂xFw'dKo&S%UCP\jۣoȹVcNI3Mfo>1sZkaq<`@-puEcppJʁ W}}_?ޕjESF-teΤJHC~Vj43l ْ"1/I[ikDj! 
+$RqD$R"_%EFL|0x|rb3&*c%`'fG>^_uNglE6ԗmH3)^>+<77Z4_Nbp~5ĚSy;u9jbQIuqbnP6E`r'0)A0;~~ 1]Bb6[{c:X|9#)A>j0E;C˟?^€G' Fη`hѧˋ~;I^ :d 2ӯ] /I͗Qg9wS|m}I'g3wyPMS-8졠nm }IP) ' 4hmHt4 s:׌m >]?SEqԃ~& ~Q^E,Id.| EF2 r jx6c׻Xfakj9ݮ;`9Xq 6]^n>{0zN]Pf@<.bmlwW,»/#^wg=@hөMu8MN{`6=CeХjCեP t󘣛Al@rH >cHT[AZ WېF 1vі}Oo 95kqhfL0Ff-qHAX]T(jD ҁcdZ]\5'd*kacZ:#)qӧB=;*$ϭIqM\xk380~ ̞1i`TzG{I-&G&LB!@@ oh'!IFge9GE]A BHF2q3V X^rVR~ mOZ^|t'<}H?QJ) &yBfߞU2fկuP[5>}ʛψOwVԼ ĊȜ+ZNiھp(/ M&Vcm !PoduĖ49WGV,,,dsU:+d{Eh%[\:΃\d$@35v^&d7Lhyydd=w7A)x*: Ț;YI X U%fjZZ3LҘ(Msb":RYɞB469nzay>E)40["R$_r(BQ m!<,p&kAq mΛn"2mi)_.&]@o=0@EW!)k:epi-ȩQZj)= ѲtF^^OȲ~`$n˺ T4$!)y&/))N%KĔ1.z4!c=x Q }BlX8T0RUV0n[2:1rV+[^ V5xHnxcIZ#u~I`z)#r.3ޥAXY+5bLu*NQk).#41(#IBg1R"XI;&H` Oy1g&,I bBE^Cjw:!J w jR{ I 1LSQ{ c)q&%vLQ\1Yk5 `rR\"elʲ>!%3R, s ٧»rPː=M1 /Q$S{T#W]N}>ـ` ڵ ӳ+* j;u~-f6XEr3uԟF!qɴhRąe_G.ޭ>}CLD4:o,g9i$&JfuXڹVͿkq7?=ȓv^7KW7"\ۺ'ի@m?5ڢ R1^]NwjqaxŇouyhq\cJU rSbOc96V3Y.5U#^#ʡ"y"hLb+a@j N=H?sB5Y*%D%K%e!HZ =d%Ⱦֈ%M K턑K&!lg47%pF)b)t7)/ 7>2MtlEsyg~u]jq!yW9t pmEJm]On'oOD?O'7o81Sy9(^$܌і>H}9΍d_LC"2oOft)ȞbM;?Jy6( 3[lr"1-9*ݩk@!骑/SΜ5 c̚(_Tǻy?ʓTc^nӘ>^ I]|X˗bv/vxt?|THJG[N3hUF:IX.:j:-Gf:=%Mpڽy2 Ďyo(DVTCPˍvWYUuu Ɋܛ:-uVYRgeKuT+w\"GAe6B@MB,`4F^h:}'\ Qy 嘃QwUxB*pnmBfYG_ڣR6圣`LX+I3A6٬R`Wd_샤'D=% XĘL?{O8~Ph^f1ݭB/ dlIvI̿o$%Y,YI&Ir$qddH Z`U 5Wӂ3:X[K$ƛqYh;9qO2(UpwcTfl|p9ˍ +GI}r v3s2c+d\/`X4GhP9 KUs)~ 8)A`@B5zCKP,ٱ?$Im\wKCcn̷Xd-!Юl qÅ5 o~%RW[V͊5zhmn=42v `x??{"A۲V]ڵ@e)Ńj?N4ӧ&3 O o ݄ ~OA#oYji+?H]c\r{GpbS.l@E)_ _3斧n`i^+egx 7֤\x[?86)R$Hߦ S4.ɰ~NbsɡhBp70<峟n?OwG;@"`ý-ΙȔO/gi(Qy8t}>.G3;rm5dsRBb:HYnKH N?|At~{rDWUg˒=et#)mYFnYy>o],x E`.ZvvwU8v4ywvǣ|yBO/b1#ɐ)Ve'9[CM:M"2Q!Rt >#ϕ(uJ˄TgpyPhΥT*]oʮwnu{do0lX{gᓍ0^ˑJ b&qf?~|t9c>ʰ*U(g;&J +FuYS$!j'eclêRuz*sknV#[-yqq9O8ޘ,>km:hxo$޲BLXcaeƨcYnʴ6HDl+7ġVqA,ۋb !{e?Ѩr]粟hӌ6\r%7=64C}}}}Q%w5Z# QN KuQnaIT-PAQVLԅ2Soaj<#7K0Lim( WpB&T 8-eĎTRmj:& J[ɽC̋!U?Ǜu(d u>W*i Bx0׷?(p!-Mfv4;E m=V ̄#BlʕҖʊN ţ/ xd:rGy'o"ݣgZ>ѫH[%-'ɮ͞r_4ĺy'!|<*:IIjk;zj1})c HʬE"$CeD"1 ^Mכ1"p .ǠɥPU6*3K%7#((3DigDXjI|,&uXL9Re']oR9~3X#=0ibs,yĊaa,p$@2p8;+ ~,jQ[׉T~jj: JRJzTj;JzUNaF-Q{% 
ZqN)BFjBٗ(W^ }6љGզQԩښ*#ʵ|Rkko۷30j]$S@ %r8a5 lQ8n4m^V>;ad7g5iV]HH5Ei8Yq3\#̅89O/eqrñɋmR\n4zqrNq2#C۽jjF2` 䑄[A(@Ts_Bq ".0+v{^ް`^N̸"odk=%҇[(Tmt2M/i>me9$Ś[<(RIJI+MbK2{7g~SCʆEPaB+~:õg !@oLA֨3y 1Vmv Q%ّڈ ˯iݰHg.}}4C$Mϩ\G;ONh˩OQF wznh:ѩ死zDuH吇j=A%fziKJRs+Y,qƄ%΅`}%cްG2a7A6E"< Pow:82!¸JZL op $"!wR{ ה4WLc,q<n{בILo5cy*F}&LD4Mo9bYAL &EW83s0@dsaK1Ae{XE L^RD^D͎H]57>2!e^pp#==fd&cgC[PϘ hE.hyiPu^^=C&>>&oT",()8cb`:-r~!TFuzUH"ر|z`@V/a; pVTB煒EG@{&&w!"2DE?~lGh}|P;\^~{0yL29R8mY5}s%$ѡvr} 9AR+~& ??Lخkȳd\`l!K)ѻR-k)(|"b,D0LM<C,;; =֣"3~ƋC($Ш65o#Rڌ|TY딘[йn6'썬+$XSʺ dʉXЂ!ÉUTЫ@l񟶻]%hZo>`iShI.REQym|3,3QjF̱c ;*SY*B6HgJqnMAd<+έ+mz&Z&& ʠZ).FQ *.Q>GZ4d*)S* V; b)#CB37Luΐ\o9c_ZNF˂r R+sVT;`t䤢0~kzPϊɧ;şY&a'mǛPH^ֵKt́SDPm2D$'07z"Z1c5>S ˁ/c5MSD ϚݾMe NPBl]y\63Le߾ir[vz6E7DŽupt"j:HGcM<_c#I/An_ABٗ'C&2X<_ތS1QsN*+(S1E!ryJ\cJD#ȥ:T@bEHo$9`{@`4cPPg ] sBٞ5V$n&+D"2j%՗x2;#22y ɈE l+2W+$`RWѷۗ%ڊ+fk4E-U:[t49p : " Z+w& Vu*u*)[Glf~", `kDvlOg}fgwfgW\deJ@f DNh=_52*eOu='3s3ɑ艞'%TEӓҩ$עƑD"qj(23QƈsR!cMgyhl5pI,'ə9FeL"-\v\rYx.x*Ƭך{9'n; }'t(#`{FQHW"nU1;8@^\bUQ)>)+2옎'ny:IVUUB-zLzu1rs-hp7 +hv;0J3L|_SxjFk1}"8Y{ CWټt€hyV N@ITs2❅[lյU_Vͽ\[}]r?9g:wz|'OL@I'=č{ ^j8xuU +i"u'T ĭ7Tח/Cżh6zN(Vf5NQ$vDGעC)YTq+Kkm4tܚ'>ke@Lo]!תdG铨QȴLhO3f%L#$[g5D,FlEܯ$f\9;u-9>bϗ:LRdɈcAizTp\ Yc}\kQU9bOk*'}e'1=$*:Y{nV^avޏ-n UVU4dhA@!vqb c9ssN@C$ҪC9wvfN d`ss~A!l925N @Tsz<@T(ɺcƌ$.oDSH>?33|". `9O@JG4ty8:q+@ ̓byhZ1D>#WFϐ*cRLcB3d%V AR#,#J +,P*ZHxhB>g5>Ĩ< ͨHı(gK6 [JJ@MYOQu;nE̸E*£xe3&P.pvW2JU^a^Wu[GE%mpEz-0W1>yxBT٧%J&b򉈉*%"$*wYX5XXp+G`0V}9HfU*\&Sj$kT2NTb`MHgg|$cIb+ց ӅUWewz\gkM 3Ǣ91lf™4x nMWflS&uYW\mֹb.Hd=!Wbڛ>:Ӕ"ݲΒ:5I;,bdhRMׄ+Ɗ%LOaA1 gf.y8E&2(rb"K)SItG0m1UEJ\*eYg'U+iX.<3،+sNnxIETy*\ "k,ip'5Z^f @1`;KÜ㒼u-DvHwaDatmqzYԇd0.;$4!'$47=c\eWX({ĉdk準@&R<7W1y*I8|'ͷƄb׾]tn!Ü3܂[uِu`~0gI2g[K@;%eZoqCyKLimOFl(WƓµLP]J Z^ּj@2.W=Q)DFE\K 2 cuOgdzzY8DZmu&a1kUg+ȸ yKJ Ptk?ZjuTn`.Uy*o)wh1+ږOqU`Ȩ$Gf#p\ʕ/(C35}Y3ME4q͂3 qg_N?&Pv `|NKJ >}nX\]#* q .d7 oxßl|;z{hS?A#x[,?{!CL|.C}% 3L7d0Μ Ql k88< n,xV "0ՠ>Dhv зޅBmDԛO6؝t!1Hrр`$17f4`v1p_R-y4ʣ 7H>}^`c. 
؍*Wdt1̚`-HtH׾ #4wA@Y*h־7W?B) rcP VܺڲM=iݣy|Q!D~ų93 Ċi30t"s gϢO~qi^Hi \v.D!IED#;2ā!`Ͼ-]f)[߅\kS{Q#"\eU-r>B1cZ)SGcsM`|0M*]Y!r p0P01YSl2)'Jai`rA&4/S@;!Lrd7;ĂAbaX3&S`xM^ b"ߦ.{R$L"DonN p\7WrtQ*L; ޛ~>cZ{k-AL{՚RNi%Z R - D+AƂl& y+1(t[Ul)Q y޲G%%+g,YyTM{G"A0UT$G#7+L"8`0)2m(cQ'v\%KR`,aΩ)jrx"pԃÜcZRAؠtQ4W6pdRˑi' f6/<ϨdXp$(sF Sb%y-266|i.Lu4>1 ,h׵S:i,zS}쉽e gzr0G2l_z ) ߸L¿qoƓO/x2>^!"8=:fr>W??{0.>6',pu/|螽l ofaWkw~O0é?`/v/=|׷.?">=3s}`:;Gǽs=qVx w_<Ҙlo: A o(g\;̋ƹq9Lq<^u؋_(?_;ލ_{ɱ,cfOg`0|[DNNG;6w&#_Hb_`#k2/NʛDT;/*ǤW>8Zq﫽sG>U ꝣ?mOg]{3+`{/vvc?Y1w^h0K?4_w=7@ gWg6w<8aXnHwy<ק^ԝO7s mO*b koۯvA_`P)WײxB';cp^c~##=}/̑COt͗=w/?‹L'`{wKo/ S8x|'px|K>'~'MUy!Jӫ /.o'0lPmiMX{a0us6wg0eҽnd/@;RwmH+? fK-oEthlvҽa3xvlvN߷xdDz$nd1Ht*>u(@]kFΨ ntGed/706?{7w^kuky|X |\5D ɪL*t~Wۨ6-]ɰeğQsqW|c<3BҒ!hF.Yh}]^DYPt"%3Q˦'-xM 6-ش`ӂM R yf ,'lx.J0o!x2Sڌ o0P|"C㿓 OcQHZ N_tj]qwnK_L9˷%z%.{&οchw^z fcKC)+DǸ S*h+D !$7B"c!owB^LK ZFM 5fɾ&s KAQ*0Wn(r𘔷N&Ʉ(q%)ҥd '#XƾCj-E,p7{Y^fV2̽4ᒻ# NT%Id$Y2&:"K%ee&`|}VK2S 蘨d u4Zg_5QTM>H*THl3ɽ@j"F )&7u.u6o,TJ}9 0!Hc1KG . b1UY=w']lv3!?X#2-8\i(q׺.dDA!X knyM; -=k[Ed i0CS%8\uErNeܤc%h\s~6=1TR *XPS lCҽK<*P1C  6'q[=# i64 ZӍ;Şcs])v:O+v*Lpqԓ7tDm;]aWe[-5?.)hEpCfDQ$gQtk sm v=Mucm+zR r{T@u C 3]c077ЪGD(=@ ^+Zm7yRZrJ@u{v$Q9_¾%*LEM];;, 2+)RLve*)@5jw3Y[oJ $a fH/WF ُF0U)lKD@muKUnWh>ڱCyZorʁNMU*Y;mO+@ulhkʅz[;e{JMeEc%+>\덒F]-dwUˮ^[zyl OcI2xvĮr|;D+vU^w0p5:;ӫy@+E9Ӕ\j ݮ2{R%%z:eL(oEYYn4=4[iVn[iՕR(3K.&cQ"65 j(huHәԧ!hhAӇ*0Pb jUp%,U ^- $H;/1kڍ6,e5ؐŤC,XlB^f.^K0 -H`:-202-2-2+`L)Yb8hfwdfCL0JHAїK $Qz~ ">A O 6 nwJaܣ/g3P.[\ ʡ)I#P10dA']q6'•oߚ/p6mwu;)PfI.{ hOg$G$cFy Li-sc$y#Xk>.[ Hr w]A}|ʞΕC"tv&.}!*͋GJU0 QtPTe㓲{6;Py# @An}OG6m AnBDMSL)Ꙙ[i,(5Hᥒ6msR8~|?vvg/#'p$ AnD;v.O',Yp-%dz|R_t"TRQ$4qaKM )]m6+=x s6m0wu;OfA:v.d$j̺ "JʢL1N0.?=`?}=jh5i] =] Cep;h{|qvz\wRARZL:XڲYy)bO X<>z3Or@5٦mC}BOSCۆ m6]N蓙28Dt:oG3%Ob't#1$aJ #dv~OgOzə uݡ,7J«! 
i2 s%Z Z#DFkRV,hn8~4igu3)Pf akML(^CI@ Y[6312zU#6H@sGιO@x{q&\,@:gu d*blRE)g'7KOڎ:5LF?{u[66i*Ԍ8o߽/5֬b8iaHGH$`Cw}@juʷ^{i֡MnE}nփ\9J}i惾};=KKr[:8=4e1* H3!b L+#hڷV8V9jlYzmG%uZjrP_ǶZZj9qq2K!,\[rN!2` c`jhF>rjNOO/n~c wjYvA}T,;.PРwb/?䡕)Xf7:iNQ ]Ĥ3( ~.,J)l " ,e?VpzLH^Fdn!ɾy9sn g7γwE;u)i,_f"BQ;ȳ頁y+#@h liяJ=8g~Pg/blGRz\ȎUc;Ch{zqWZY<Q,R**VABG:z.H(eY-LZ,>mCmv! 7IF#wpi͝0aŜqIWHD#ڹt#m`n>^$b),9^x1㺵gUq%9{O> t1Jt줸"EK*\!y,DLE4^{u˥L moٓmn9}4bvӋ(W=s40x:g'Hv/g쿗p$||Ë_ou 4zõ0q>Iٟ~ o ջǯ:j?O7o>nT}U/I0׏߻1?W=\@}!P.u%";\qZI{dHjQKJۋ7q|}w' ,\\ٻ#xCF;u?t$g=p6Xw!CQϪGamw6ޞuNYHxCiuH~}eW5#!4dj^|qE/4us=;,1D@,e,9!wYlyr/:EX0Δ|M}W/G{;nR-=}śCغWGѫQcnW.k2枵G 1VIGe8~5ŌA %G< ^NiT&bejd^q>f`ngճ!W ,@i# f@ Sv 3rڕ?66r,1g/58%67?4-8! Xh3کcUb[NUWMߝ~?<2qנ_?PΘ@9nk9!1G1^ ~Yy~xqKb! .$ Fodқ*Tʤ72e1"hGK4Ɔgj{H_!%1YTwWWW '%0`8A>BLKDkR3;I`[G$.g骧j>/#<(.Auin9勆.6*Ћ /{Xz=O7k^ܼ\tꋟ r$֜ɍhu+z=,Gk[Ѻ!CC J} { <,Ϲl3\lc~翁 ]@1by?v7.cǝKֺx [{[~$7X.zs0v}._\!R6ђ J.SN˟n^*$2![jnlYGŻZ[_ɧZlH΍$ פFAF`=_ix_ˋ_^~?Ļ[|y^!"r:r+l'DU"onz+ 73X97IL87xHvS5 1Eqb"cU 7Q9?nNR9`@ q13''?3kó5pOR_)Klo$Z 3̺!]aw$ )FHD[kfrYbSKԢPhqѓ a49:jߓ'i{T^,y;|} p~<LV;sigkō1, F+J)Δ9LF0k;V ȩuasO0}RҔ&ķp>xdu@yXbZS9ijK`xpobNF}:J Xdm-=?lN |yתNK`6B \ znW7s U^:ۡ!?}Qi!)*KV?/DN;;@;Ғ'Bt^Pzˋvf$c=l[ew_[1Xry.]Շ7ORDi<\zf}X 6^ׄp=ZOv꺹^ޞ}3;N.`9]t+v5yRB$Y(SPãr RmV-2 6 d\,'l& Yv$^j#Y*DB%S ;C6|n29A+dJ|.}{5C C4cّ91<#M#K9\TגEBh).mNr $h?m`fӗC{n $N1u*x*spmCIqԪJa)Ҹ'٥3x1$DJkp#QI>]*un6xbzh 'YA٠yݲ[N_de; frhAnl0Do>qDQ{N- (ӁsTح-2yavZNm: 5<׺d9ųG^qLǑW؅E^;@%!H^ĶÅD2KgN-ˇ@kZAy\v֚OtsJ@![}([K7RİXD+kBb\D-=#iji!sKe` nbPޫwj`=\=:k♠+dOqi;Gpy+ɳRWSxgEI@e#(9q My8Op7łp@Sb5nSA3@q5 >x z8"%kXLA',p=5{ <\I9lw\ٞx:@L zgY=8מ=?\{t `?Rq!V1E8{GltTL. 
u|B)3eg oﻸ^.WUzaWvdFa1>\ yYqHsT}C#K<+{Vue|tǙF5dQBt6e(6%s瘒sV_qx-EjAr1UM6[{}Li+` ?,1%[͸RF6( xNr*B:KSLTtP?/RQjыL/g&'B~/]$ڸ03$91"B,̄8g1o>~Is (/^Ç$)HX=1GRgFM^AXɁz\RjuL^ZzmJN-\!Z4-/n'i{Tgmzp yY9sma{@)Q;ؑ{OigC0T^ ﮳.yP 9U&f+l:m 4ޱl\+vJ8ħ/]Z/q]%Ilm WfFq\B6o&2LSzMX N#SB]^mjtk ԢP~/)^6~ ~UO?%Ck>CyHi6$āf5Ġ?VHs@ІЭ–5A%k0 0hŪ>]`m ў=9LHΜO3T/s^/|pc56*q;[MAk5yb=J},̢N *1*e2Q)p:{CM觤br-5è&kqAM|oYypR {#OwlEAyީEhYH& *O#2<Lc۠19P>}c,Pb4p ǭYw<.eEz$ u).?wrQXJV |{q:+ +z7s|w;|FrBRGlg%%gwd7abYӋJO'F3XNTHA㠄X?<ջJ:`=" Vd&AzAPbcIq/. AyީEAndžف:cl٬So6&/ `n$3̅[ >t#ўeT"|5yp#f­@.~63_ jY~x:r˹gz8JpC [ 14䨠,VYS;Cu)h/exR5[}3x 0v|D{.㼅>YH]b OV(r&Elljw@zH Mh~H5aسZN-?H/Ɨح보-Nc9LW aZɶ_ynTn4gC !{KEZ0D#5Ռo^FFnG6@ Tʣ.)Ig.i8~z"za'zL'E Bh/rA߿ lqsaaq7y8?xptX mV:4 یbVdkHp 5?^2b^ܹbx6~3۰ZɶAN{#6KgF\]a$.xv~0y`yoJĚlg` Fb2L}ܛ0 *ϫ?<:0߉) n=QQM/,aaVQLlʼWV_Ef}:Z *zj2BfŤꢸ H9+h雡 8>?(23(._!kʺ^aw0뉡!C$:1BZ[AG9g?oF@@$?̭t9\kQ}"Ad`e+-\Kb"{Օfs`~w{&sU4rٿ i\@ތ/G_V-0{;gj>!)x}XGෛgVr_~|rwRxשt}:+# T <V /!هWHpx(-Rir"Ahq.mqߤL>pt:W8]$%I4~),e47۟u=>~+׳*Ƶ iOpx:s5"TysMa<}pGg{`g5  O3U9E G 4.f_<"`p󄋧w/m/ax,TOj܅ܚ ̈́m^[}^f&.șxQ`8H[ބ' w*ZAO™u`LWv^ evԛ޿ p&*jQ :]_t/eɸ@Zp&j4B3/YM獻Eny@@Ff+uYr_I<.lLu,`&s9k1'R.e2Szq{"*<֠.\eP;rִ͹_1<3<ƍNjIm,oMǤryr6E]XhȗLJ=8( +՛i*&5gJTdҙpe: %-7bi8譜hLF(QMR/z# S3Ř58p6m |=.Mr,㎴IKJ< %Z=[ uIIKTw"޳=]Nafwz)(Xղ8^p%M(O:xZۉ)sKb696SlaXb)#Y*pFgJY)[]ա}NV)rn{,ƓҒt?rI%9s `.Hx[1BZC+?RGd<`Hq ,_㌶2QWuD$ jd FQںI ȾL%jڪq2赭}u*V1 c9%ؾ0E+1 J*mH@32:c μ=cdɎyV;eI(ltQZ8Vh)щ.O wUGA^KTH&|J*A+BJ~{ rt~/#[sqZ%HjNa[a-Kȁ@B6#IeEZ?/Maew&p+XK.kVgNYHhV`n?KX$^̳KvL@/T}ǫ(άy3Jbf+`A4tG$+BT;D'&4m?I >w<$ Ųj'>kHn0B-6TJtōZ2%o *Gʣ )4lҜ⢰QBĻ bssj<I͌\EZZ|mn;DuφqjDv4:wUo%jg?'36&-79uH6x=Xn6֭(V|an]\P>2KLZSޑ u,%orJ7^t[;`o_{*We{{b N "b +Xp\<ϥE+⭝/wD͗)O&F]llCć 9CPVB 7 ٥i'ChC e{ƝlyǡG=Se?LX yk2(l9EdQL[-p8yd.%* dZ dͺ w֌ƍ}1cDXC IXKGyiO[usH>y} t[lÊ:>wFZkpmiGW#Yu ~ _$70%=*sc q޾ufmWEӲo7-76$:yKh 5ږ#PgP#8v⿪_[3nwīӥ ̼`ޕW)<f/۫ןv`ͽɇ7_t0 9)>??U񰖅CW&~r?Dp w/f_<'.$Rc x9`X9+@vB0sji"7pօX rXi#V궘+o샹y 8F]z6 Spukn{&*7ɯhD$Wǯ@yH[*b*IjWO?Z$sz3UHI4#V?9[ MY `.P[p#2Աŀk0% I CDw $HI֞4)M_UCe[O.!|rC9ּ0[!bY`r ((DS$N1= e` 
]"3trcvdfNa]4]&S Ł  c/UCJmIt@S]XnG(8[)0P/2Be.HQ ycqOH+SeB1uZkɺ!.WL$7tA%S=ވ `J<D8ϘW8SZ̰\K"9yj`-CXejf}i@Dɝ?yur'8rJzGQbe㘶Ve ȌU"# 0Z4Wb&2q?Gk ȵFG#mzQg9jRQ>.eޖD)&"# 2R2"ҳ0M8!J4%h(E)tJC叴,q2܎GvڜCȼT3,nMJxJ⏷l2`MŪS%hΈn We A0eB)2X)<g/^v@DR(a$dq HeEQJZR9=N$uJk}>]x\/prppz;G1Qe-qٳa{OLkq0,!=$ǬU~"JD<$\M:6PDYkZ`Ih!4{49R (tNBn0pNӹjyrɒ?37R9ִ/.:6TgqD$.:,`̵ˋHYq";Ԥb4bI}sg"L㝁XJz[O`A~kWw {G> qӂ[ޚTmaRj QbGz8Z# hk0ʓh0Go͈w_-N=:N݀5..Q:$ZD ŃO-_w#h|([S}"Э e=ըKajE)~َ;Fgis= eppuT0P)TQHTb.g=d(=ӣB4z9F9$Zj@1fHx&H$q8ÜZ]<+۞Z0Vq4D{o@B+H),Նjb8p7a HFwGۇP pPGֈ&`Zd2ǚf`q{trc>V".(-9S+p\tPf8z_f wEW3PR'}Wc9$8%={TY Sy|dP,7ga5+8g;=7k#!5O=ӪZţ/zYގg͎>V˽4]MJě'}w/&v/\ԧ,*KDY"V Sjbfil`Omiq !BrײJ"?~jq>ƍtbضjR/# +ȮF3ߖ1G[[czWC\qQi@^8/B-2Եu~?YM5yvjwvjwKЊ2+ p*(g%VH"tBBC?σ'oہZM޶=yp_(=#z# Qjvo/2.ry*jd?7d6+uU71`$*dAQ8 ! Ci'C+# J],.3J*{ЊP+T,tr8qArkvlXb!]0p_`2*C09d  I@_eqg.6+ LItKZ*]ftW$R@-dyJTTYCYQ0 s hqV82J&礮7c-f^}*;I\PU{Fr0MPFc:xOFpiG̤D/Cc$>B.‰DtIh)hcً0r;^& P]Po ڱc0lGpJ45ߌA UGIrYTrEKx vƔZgUr2̼d PJ%Vqhǔ,]`ϲP!+%I2!M{5,iJ#QWMU: By?RARћT3s!Eep\%>WLari]2TP%Lh"KsqSm uLhK Q; wU|$ģ8 .j IbI^{%[-0M"qHJJ*8E(a~}#"%˹,(QZujLAɣ ȷT dH^B ex[?߫]X+@9*(k JN3A]$;9yC6Aix ;D=F8lt@Pco z>B~<7gHA/xlT۸wfx %} r}²^WqnK&>[(Q_.2PIMΦkMW s cB- s[;ߩ`|F=,0!e;>)(2Jaz~c|*_|U xF0m1sc DMT$[R sŢZQ{MfM|]6}634>G5NR{e•yBJ h$8(%c%WȀ=ci#wUc&  lt9XD&4SxwiT@^7,i5<%ܑO 1T5E?r?8!Cә IEБ3b*9E∠K7VAGi&a'̥ӗt" *Ϋ$FiE!5 ԟ:t60;aAvwQ jx,Wԑ-0Q).2k^?| K!x0/p=О1OTgPܻ7\G>EJApւTUV~iEd|C~6xO#ۆk#<|tQ8z_APv^fk >~76`>~æQzeST!U= wWƩq!OZ3߾nC,m)_F$kҚU'ϗ K;ߘ *䓯GR|z m/>⌣v|̉'\GJ(ΔHlF0$/+77.RqUD|՜zFB;V~ʏMOU>S&køc8J8)ニZyCC, ލ@è [ I"^ܶb|qjPk4x5z̊omex8*cUiQ?/GeT6/A^kzyV~ ~Nf ]˨za2.><p?yOR*,hZ!#L+j]aFj^jZqBdVJh$з!DVSBjB4͉-TGB$E9! FRsBldF!.%]QP_a)O EZ;PUޟf8Pr| h$IKK'O>O&Ulʢ$jK"Wm$]rFV/lՆ 6ߙJr|zJ4؂K'rrsq<Y]`W~k ~E/r/6KL&Q吐zj-vPI@I H] h)3%xw-0S\e⹡rF~N:Ά1:uWMS,]OE1$ZO vY~m၏j⧟?7BŞ1G ^_C_r xKBTx4dPVp12Zn9PCnI&S Ryѡ+[f\oEA] $jȇ݈S!1eϡ{N> @D! 
miy@X󑩆p*l㔭{tE`-:~4 @2y'N=+Dk|q!}9lB\tU&<{PQٽqwF"ܗksfR@N}:=#iH$ʶnU?mA^#}HȦ"K`MWЙ0'b3ҢfU6-Jm"<:|%"I?l<x̧眤+!u$JXa _L)d4,kI΅Mo h4PaWp$V?ƋCFأm͒+{5GAugfuw0BH1UG0[`roK aƕъL 6"GdA uECeת<,t~~`M+<(LNrwW|30 "F; oO6q=[%&=< \?Œ!3UTJݚW5}R\QJᩣާBSӏ򖱁1Cy;W{Gvak"]}THU;) ĕF8Vj9-qA]i B&5+- p #H"KS^c. %M=3֪'8U[)H0:* 'hXXG2"ѩ#iEboɋ]}MRE; C u$ZVkibo@riܨ RMw@\o0A&ط厀>ۈ:^BIIAI1S"PhZ0l]Zp.3T=g!Za$jv'wٸdˈM 1C$=My/A?\(Q~¢S c p*޼%ʡd*SGFZ_ ~;~B40exLQ &&=ڟaPOD{Z$돑E_wߡkD DTl`>I}CO=EI0.t/j /} mm}ߺ>uxcBHc6{$T;KtQbZpD˜9)*-QpߖKA LD)/\$ ҞP') #Nכ]{XK<>/pJZa2۝jJ;=Xs17b63cGߌW_̲Yti._}bo.~~п̌opܫrPQSϦ` U0π?m Y^驗ٻ޶5Wc=1_C6AX40H-945ReNhSva'&s3yaUi~]1Ա@d4t0wNk])kpt0/rϻY*xvfo{U5B BgeLʬ4v"RoH??R:De9FbkL ыxZ)?Ӹr1Nalw<'y%_T"5߭0T'/ enQ tT 4dh~ؔۏqşzɏ_Qcz9TJCiPf%,i=TeH8C$!) KaÀQ-yXj* mce<߮f颇+9k2?5KW>?'gh ׳㧣Sq)QOFe@ONjX256 7ПOv}F`M;!~q?7Cn@h2<`։[wyy/K+¯Rbc }SK٤‡ߜLӃѸn?tbZ](|??M4KJ`4Z Xwg#,퇸o>}{y&GOG?Kofͽ~(!_4?M'yzLNp*IQLrcGO/z4 nAgGX:;P gOVFDs|5 p0ew?~:AEO 5y`2=b7}@'/_fM/G nƲಐfj4ǥqU隍:aŭ櫦w|,_CzqM *]&FcT0XRP+V*Q +bey2C/d)&TDh4|%ZppB5M٪>Clf)Z`j0 N(Up~4jTOdV\GD0;gP S 7C / ]!8* =,T~PLꀿfPN)3%\5+jhDש콿wTJoJ+-P5+\:)rFI*hpjfI rWVy]P6#^:(f|wdL_JĹ*3JaCPg EfiԺ,S|Yh%6k^.=8vsɀڬzrc݄W7a+ܱ;7em[ ˗& &hf*-uWFX>Dٿ=_Sf$ èWnHi>LZj$u!Ĭ~ꖄ`);˹B9&9^iW^m2@)ݪ|b_WB0).b5ʾ=^b Q( ðuGOmI /b3KϝO^(p' lV| )SQqآ0M*}1[i *̼/pJNH;1A%O^@4<(Ǭ_4O>Юڿ6ィIuO5:+ *6u=J؎bKǓ!ەv' iNA2>lhy৞7v?7Dz\ %_xяb.3Qu'#>,2[!9qتࢋD$IVXRXV,&JkψyTtA}co`SПXXXXj _0/׃OG2oFq-8^5P,3~+GeP!D+glDDRABDnȭ~雳s>;—lz6f܋m B.. J؍jj%ڦ wUG_Ej%5K2C+A^7_Ýߚ׫q+/i놜~-:Cu 2omy6 8Lr,0m`,mޓxe;gM%7k8SVA2B ]-^:J'ԊϞ S̗ I_bگcL7J_P:̖nKbje94#-K!ϯ1ӓ l̳JYΐ֨<2 ,adtH'+ɓKNC&!y(xfv) 6i9CƯ/VJ6:etZn;w~Yl=#%n=4r2ݛ/-vFUt)$V)1iwb0Tsafm*N~rE^{>.qG+c^oո{Nɸh{3Ԇo]PoSFnpn ]|CS!vr_HzKI%0OY.t8jAA"a &ѥ *0lJ" 48c.Ug]  f)A.*#T… ȅa[qu_[#UUUժoZW 5 ԡմ^)Q/ x GxE2(K*o!xX&&4S '0nъULI&Ԛumx8er(oMRO͆QbENdЁ0*jp[kx$i=quP'ҵkDj)jSWEdAپxP/D=)H"P Z Z㳅ceä㫥f$ݮ uݮ'gCqFݜW:G󃷋W*Y 5p"x𲲑#. 
v'C* H\A@U ɔ!Nd҂ bɗW B69m?X U:!tbkNTWWWW ]9m'\!e촏)|&$mщs!K'n48a,MN,%b|<ԚEBI ܓJVXBge BVH͓9\lA کt)H3ZWb0P2NE2U*hHd20IL"9 _ QڄS"€hio0"W2幷4^Ip p /\p "KE8TBk%uƾմ f]MpfSȜ9xZcNZ)#Ұ)q[hl+xlB,e5'MJ%khUpNy@X`5Y27] 4%erEPOxUcV# 1V)ɰSʖ'=IY+(,jQƼHyVBI=4lk*W%J_F;ƼRFI*VFh+k blVLkPxfɺ$Jk84 hs$fm 63 fid1kvy,BtR&y݋Gڿkp8!yt烙]*q(]fx=<ɰcVW`k/Ge-FN?D{,[~~` t6_ dzf-\U#_wl2bϏc/oe84Z/; IjDZWJRum2T{thR?0d+O:Qz1h3HAƁ^ q@GmBp-"C='" ċ ;0s2JG#<3h?P1+fKK-c"э 92.CFyhFGD'I %|tLj|V(9Njϗvl|i︆1eC{R fG1˷ fBy6Z+z>$z\̯`g;,;}իnMO6i5n] }ѩ](o\VU\VU\VU\VZMOv4]FrCLOXʪXʪXʪXjR(h[: fڞ Z؛BQ֙r!oGfZX>2ⳑ2rnF@еdD ɺbQY;XjQmE`z}@>FTiQSx/ZjٖRwBǺ,i]9|q̕ 2.DtGowx%61E!,JM@p2ɠ: g Vj7ӥp)hp/,l=g:w6 &WiT+X00-k]NNs:e9M0M ػ8$ȬUU`; 9'O-mLQr-?zIQ@1~F} ֶN(q zR0I# Cuq1㥬M:9^T ٦RqDuOȍ+hŰ$ӹaHךRj!s8U^)ٰQN6E1b,Bj,ƮCw4[-:&kΨ ^mxP9^u42ZvT!^sOXS0@֙1SmHk)5=H"0jXeuD'YwjHš.HbTLaX,u(rhE;9CZDD0mY:ؐU팕+2fˣ\P̠535JK#?{83nČ! 嵙Qb(FjG./tcNj+LyØ=SE1Q5]aG[bYq&xKH dfqhmc!vY RlUVT]3mYO/m+IvZ0U~Ҋ`ր;>J9ric%Qt6~/$hQm( Ai96P.W,w7-#8B66;ku@= cSL؉RFٖ@."0{/3jB;j3ܪNؖ@RfkU8?Ke.{κvx@wgM; 7Bc]E'གྷ' q s!-⌙j c]ںk>;)8La%sj 1 NZ6oȺĮH͖|62 =HR;!=ӱVlY!#%$MI .\N"aOAIƞ{5Jԡuo!١TJ!kF6b.p~2g|hz+ma¨k{JH r76Jk9rŴs%h 'ɯlw۝M2Wfor3B&]l/-ʑ>HkvҜ ;a-K6ޥonKষ9x`ؤ c[^26~y p`X wߝcYSk>30PN?B"u7׍I-~QugWեƛoǛz-N@:pT[Ф5;7jhTjzl+RTЍԤa;0C$#ZT})Ul* nW*{tV3ɸӞku wàd6$0nr9 <>5d=C:B6&F6ɧbBIEM6aƚ; =ޯvjbOW>Q ZP];-FI3#4ޡ奒,J%qB u4"RjkQ[.>H ܶEYH$=0 m5&pZHZў{PhOWgfHeM΢dۢE-ur*u_Gu!6OGWIJhZ"fB:ݔkRqD-__:K/l(j,*g`!ݹ=3ipn|KCW,1يAtj v&if< HLM?rk8H`&;8law;C<ݸLߖˆ ?}mĘCg9~K?]hV6iq'>CC{*X;I1a7BۃH#q_cZ$HT>_{^4ZcꓴO:;1k՛'Ys,jyh߿y$OAK5UC +~>;Bxm%OퟵW%WӨ:f&13֞b lYa$LNJMC40K?7kk-n]y=G<eqN6g(h-H n@ӈ% ßmqH:8mVIK- ,Oq~MK*㸃S$r/xM#OJ=rQI<i\'qw0_8ef_n8o[ڍO.\Bnpl 'SI;[V* ?K@LɒXrrXB[*_3o]"8|#◀ AX=0ޗ@vu0_#F;GgYnE<~MU~<Ɵd\<#Ap᎐-A;"bA*":KF[|\,f?OF5IǓjr1's߯ $b\%@#1odD&̻.{ jc`|`ß}Hd[~sWClxk3*xLuU +#PWT&ڦ(˪>CfK[:3>Ly_k1՞g|6ŨYВZIXĔi|\TOe8B#S{9'vnbmҏ\9LaTQ qD r 9y8-iQ9]KB&atkڥ &TlO7?"yƹDf~kDřKRӽ Pힺ|6M f 7*ē} Ήʝ~Ξ͹{jRbJ2%$֞Su*^ȸksb^E\@wb^]@JL_%/ PmDҠ+PDR~װKluT*SU Yjo=gGJ_dNBPz\#Fki.3tx2dt)7`a"58aY 'W!xyO'SsSBF}ъs ^Lh"/' riwӲ ~dνb8Le,mr 
Q4䌉M*QrzI1*f"P'9F]C _u|3aSb IZЕnrFWPQrU5hz_wUOŒx[O;&3.a_3(AULB;[$V=:!¨3PՓN /z[:E3apfں;N5ۻ.E/y&`+=U?10k>ܪn5X}I \Vasj$e0_jf~"R#o7ʳv>jU,i5tcINIMP[D2]Sk!)Du^9Y JB[piD3d ZMm{)uF%5:k Ԫ5V>hPk-[jI_*dͧ5-MR"Tb|P1R[mƝBM* b/$ pJ޵57n#뿢nN5_jvglN*iS.'H\ViL]hEl(khts-uHEyo@+2ej-7Bhe)$:4a]#1$\`%k;%Æy ĆIQgBfJSč7?)GcX,`#x;|[??7JBlka Ux` F<0J&@`C8qX&)P]]zˮ.e?pD-d3K=u<4 lhݖN䂈*_ĩؿ^`W瘫p}pBO+?&)G|ihr7/4\Jg_#kph Soȣ+QأGceI*!q$p@Mh>kc __(u1_]1,2.t^?íAD$&e01ǟގn|c!Xgu:ξ=.,H*kGyCd;/V_hm)Kn5ryKp^׊~]zi"J_F[w`Z??Ϗ߽Yso|M|<=1&H(_1= o@|'3}usZ^<\]OfJgK̤Hje Ȱ)ɔ*n=e&5ydʼKMN߿/[H#h([Tw|᥻# \^eHد Ň[<=ZZ5p>oxCs(>O޿[~~Qz`m8~+޿2x~4sdyu7)N#^߹1,hbsꇇ-8(uQ;^ʑi0y$|{,B8rHdKQRq2( e(rʠ\ˠ $JE4D̃YˈRCwVl8P ?xS@?4!/}p2܇,߇,Om, c|4˜tP5;aM nϘctFSOm ꥞NXvR@rs [Ti.W%[  st +An]c"q_;xx0 #l[mY>30b31G;J;J㰣4!?xU ˝ @۹ C@]= խ"CUKkI!LTa+%޶ _Fkfdw P0b-VO/,[3 )aw2٤t\x?l QF]K0mS)Km9SrZڮ8 LNzR\. P*~@|=GN)VDyzewT+4LJD S4VYɃH*]uw~> x01u;+bT:{|(" խŠbKB*j SzK$_(+0 ANi)^կ^F=RVJK rU)-ɰ140GO\m#ՑrdÐSrJK8Zz뚼mr=ZÁipPzLfd0{ZHM;jeW~oµ݇>]<8r<1 AfG;7ʖxpBw|ic%PDɥܰb\?95; KW|KSE*X6cN9pB :*e\[۰?2G,s(S<%$‚f0nT 57fb}U/ <@ gC<ķJ'DB7"t=5Wt嗯$4!U3&Ǵb9)&lnwLEg=9Ĕa8M(h !%(9DR\u6.Ji˟H,jYeHQQ0ɓ^ ;4J(;јg4*٬/J,o>msgU1"5ɂI|Ŗ$&>`:n&B02B ޝΙiX;R=*8mP!{4%%OM6)(lOR\F}oKQ?SbO͜qM[Fl|֓@9^ԎΊm^l B6U h2g=^ a^Mzٕ,D%iQD.XIH+-Z~h/R=:)3nD"TcM~ڦ-kM{ԃw(Y)K]b +]Jk]O—2{# Ҙ̭]sF3EYh[O,ڈYIKk#%# 'V?.Zwц4ez@4|蘯{!Ь' Mx%v\jcatCu_ٓq>"2U'ƒNm5]^0!P qZWeZ{n2RGLzo S,NSR*yC4 6RjhO0(Q`S0Gx.1P.N]+ݨMFJ2:5;KG6oq7SH1!9![_B݁*#WAvdV y2PI)B1֚G_E9eǝa&n}X֣2C\q^Kw!',53`ha + Ҹ<`Kc%@*4DsWPs4R2OU, MtJE"13Dzd7pqw&t %x]|~oR\*8A(h UsKl@A`TډF mNUr)4р E'5hdgns#zRGikk2'a0ml&VR16>ɗ+Fp:}YIB1~c^K,1\-+]ɻ~.I}624t`ǔ>D\t8Kf%-4gHF*4UGl WTNGCWf`'vT3#c°@hllA,?oGf%݇--Y59b/_S;0GM _‘I . I-5bTb|0;P)ˢdSm͝XX91vXH1Hᴇǭ +Q MS ASʵ5H{%LB‡[iuT4Ůt>d3Dڄ2r8aI?2m ֡ rW1P{Z㉣u}S%F_0u@)6 ,M]ROJ(PsFZz<&ˬ6*@_0"%.]dE_wϦ^imH{Y+r ڲ&猬irb|s)܀P%CFDOf*Nhp8Bt\ĐO˩;c'4=^|LXYQҫيP?! c}S ";Ƶ馘?ٿ#tVR*lsy1pR#Ϋi]:\bM\:OԘw쩜Wz+nW<!  
y؂cOs`GJJ[ǴDy*e)^5Ӱ]~ֶaDG4=׹ΰ |f#W=ɃۑT>(#ZroտV3 VL`Uhxa x&Ş5SQdGnԊP$Z[/zC@V{XV㙤 %QbTq'8Z20U%5ږJy_F@䐶Dp) %-- dENcu8ArNX4.:`+]2TZ}Ø=0*1BDUADWz3Ekt_ruR<oxEF_0GiUc;lI sirФ,Q[dJFGeb<0t4/r|9ZJRO\nsWX9 Y!$rSAq1ƵNQQj, trlD꩜޿Uq.C=N3EܚQZ%+;7tD ~>AY`,kJ,1¼P\kc4>G9DLMTMA%/1V!h3ςu$:X:m=.%`pl:T>BxOTu~Q?=c\s,? WlUƪqW~v" EK7N3473GA.hN h"}ǂ ZR6@qs~v rpiN`#RnjR&H,dd oJyӷ uJJqdb C!=Kk c:N8*dsQGlQi fmt8׵g耓 8o9(_ (2{;/ QҖGYAjuX~|oH#HT#ޮͪ+Gé\˜͒UulZ{1 $0V1b %P FӎI1S-2H V{#m:xǗ ċ̂pfn-InyX  Ă#ΧN r>Ŝ!8W keRI`$S!Z6zwq`Ta2,( zU>Gm6~}bDε&`'1'񸃓ԊbMs}| Iv)UNşizΞHR\p}_'.#<ٴ Zp|+KHH5§nlxç{3}{zM, yj[ \IHl4]_oч˻O{p?D?;AB 후]^:=81wEX锢$ʊZjJx}jS^af~WJꡆ(9rY be=h8gtDm=k9jP.9 ?f_\IBzcF" |uդNjT&Y8>Y^Z8ia᥅48Y1#TwrlS!&(0 8cʞ瘋sP3Fw!dv8Qjq.|%5kV:DK­C5m9¢\|񝿺Pw-69PK+|k&3kjR$ZtSk8)sKra8$LuJu`UԷ+Aޮ,"rt}Nnf`h|d'barJ'%n)):z6ɺڿ$xSgy <+c]I fDzm(xU\ D;au+QJ>7|\FTّicLAp:"S&BNOш#tE>& {N`Qw߾P` ÎVhΪ FKobC{4|Gd4[3=سgqI~\{[~YbwJ#α9ǜq9ǜq=ޝR{wӾlĔ!db{5xhJl_^Qj}?ϟ7R{X=xJ)mOP)9Œd|Ṉfyo꼁q3)f6'QR_Z >":LُcځwIY+ߎ8r1%ҲdZ;Bh+Ih$j˪?)kZh+;HTI\Hz5S̅A>lf[b'Xc=v! !t;92ĵTF'vu,]NPKH)۱\Hh 9Qrn. ܙ@wfUӏjtccPd ^~ia./@,x"B?$KӛKʟp[ʁ3.# p,juQY K.,6kUS'@0pk-MϛL_T diW!jNJdOy4ѵ=jzPnwgҢ--xcZjL(EGfD)^ݴƱi^ccSf86}Rz:q0oLI+e<'GT K<yWTX^=>aƝ5e1ŵVUa,ckO.Tv <"qf0:>! dEw5i>voM\y/b5 `/-iK&]w%4fE_?U, FtTӥ|zYd>c|: ~z5G̃XN.G آ@X# bG~՛?a&ǘvFU-6~a1磇|6#_ool0GKS*V"'4R-rr#q،>Mȓ1\Bk]*˅(ԅW璑RmV%-,pT.7YZ;iC{Ё96]sHVX;&1\`єooc1p]7pM~ R1 (هF[Gy*$ͮ]foLMn~{7sWP:og7N Ts=rV9r4w1^ԇcZ8B英|vl.}rOKITHv!)C3CR^BpWUURQ8v H`- oTq>ߦ}a8|Ӈ@{oR2WGIW.?ō@l ڒm/6i@ k{ //H7*%ck׻ H=YPbdW>K+°͙|,sTpMq|p" ъ,k_z;MU\4G O W.B$ ~ѽLSy6F*T˜ ewi|B3[&;'.>bi;i6P5;aNS* 2(`[pЅaVK"r3"Ҁ ,x j(ҜKE(&IJPޒR2 Ip,itFH+32SaDaEĩwy$lb]H)~G!*P(EJ ,z ^Koe (1 GdR 2[L,v Ngڪbo}sqHoh"]{ -He( Z[̭W%nGYf'G/9!. X <̮!^)M( zЋ,?g4 .y*WJ\S=ȋ̥™e;;<Ӿ{-1n)Cp|jcyĺ' 6Vd뼘8v7TG02a$"§ȕj%g\`EK7aV.;8j? 0'ƼE~+6<8 7I8~ mdf3Oq9J]CQ;qF*oXГdd5ʵkkTQ=Vĸe6jҮ:;(. 
괼k$7EDeEX,0iB$hj,8B74!ŁK͜Fu4T_Q- ^S l3f  AZ$"Vp9q)d0D"-qNqse2@NaBfIT̡5N/M f2Qh )޳1uMנjL3K]鮞U>lԁxYOJ5|7_;NT)]q.ٛR)^}, 2 \׸K%W^4CTuAF>pϼ>QQsBلA{${S眀J8uݪR-)1\ncCv31P$PuϰUDzikczZ4LUOdB p(k#ء֜13 8MWhp ]]տzs-s5!c kV\iMY:5!@)CT@)ﯭeXc0ZôKCVIhp0< (} FҮDQD?_5;/{Wƍ K_v(Uk;UlRvv `뢷#)W ) 8"d9X/ÙtADu "nFRPVe]xHkT()Ҙ;TkTmqV K[+ޥG=V攔7veѬ@!kNQЬнF]#ZhEr!&"s N f\-ͮ$e>grcw⹿9ݒfq9ѻp Yq7?6{9Ǔ|,LWK-fb^.7{/Asi칏C̦80Gg q5P%KmIc~礊8"1h}$|y5_E5‡4&m]I*,d{M]O%B)Iˣ뀩 {)D2 uZY]{>>e!b5{[.H;l:|CMCÅ+: r2w3s>4.Ol-fB\O"Ҝ޸N,k1\ȕɡJA.^&.\~RN>EKc@tc?EvŘlB5 () & Cbf(Hq<cA4!&Y-C$UbȘ]{4- &OG0k/~iՏY.i/A{9M8%4'4ս+p-4ѨLr$!HsO4P!E0}Xu `D#ZmjVЁ첽,R*GT_pQH3:$bru#2ρc*)深z,d I=R O:9gAMYUyqo0k7g.):zM:qogt8Je($%︈7bS R=BTq,c3fK>~e vISS3$`xrIh YFwsE(ljWE} Ӗn'eEK)d%[Cׇe~5(@@kq`SV$x0$󠰦pƝ0#A%E:~ǀR`5sK+@*: 71 &&r7דxbx~̧E.ҠGrkpWd! ao7?I.X6o~Mï~:k1FtV+?z}t:-/z~߭cy/(PLt2hv݇tu9."Yw3 9ëC"7K"}}c\7NlςeOpYտ{hp9ꍙ~fǮM_w5En\H_QCr˫{ h\F9W=_908\쪨\̇ƒj1k15іR I[Qg]ػIзB IyjշCUfUyP5ʹCMê&/8JC΢X@,F]bnLW[7{úntb.Ǥф~ydr~1u-qD-W6_B'+{:x۱yo-g=Df$egWYnpBTN!'H+o-'1*셬R-'b>pDEA%?6Gz9ZamWZۥAΪoɴx"5**Mh^y%C&qŭ%KF*ė-Ec44&tFU>%R3|@\>|0g# 3zT X1ǘҺD) 5(:>A.M0n9`hTUqDb.8=$)υ 6IiY)`IEGXPCM׃ZO%BM9〩 `O$&y-8U,Sq%,w.'\a (ý# XCI M슩D(R:|G&ayKcDQ/qMD#M XJX[mTzZ )IjJ YXEO+C&x)C%iPFp^JZȯR6VI*hE1]xE'[HDN-V·j5T+Ym) {+W@Cqf5c* Jv/CBa˹O}c)ŏX]{}P'Y?1HC.ɕiuv{= A,k1h%(cEzO(M *F=EMrWz/z `dhQo= Lɧn^ݡ?f+):[S[ 0C\fS (K7UfIuEZeSqeV#dSʱb@[M&QmInWD"J)HBa:8yi\ʹ1Ko.z9n )[Fi1JJI-[NsWBlK֠f ^`cRq 1ߝVaYW'1] [~u.떜zV-.Tcܜ/0lvW<]+ezc}<)^`4X&L3QWAU1xEw{f\3o.=@3:\sm|FUg}}Ĝ{cW@\ҭڠo&WS ?jgaj^ͦD<6=eʪ` }&oqKpB:bzپ7;XwXnzMcgNt^(TػĐKBkn 7bfU] fw|򬛃t?J>8~(ni(a5V}f̪;EHsBtru;&q0M`4ISzrRaZV8N= fUTV5!T#)\WΘ}rU҇ǻMˢ\SiEZL4q9W9Bn#>$gj{ȑ_1ew+/E _n68fs21^˹dn1l[RImvb/9 K^/N/~^zS͜l[Bc˼E1@B@6{wy^uW\,OɺCYE к-|԰kTHk¯^\TE!PX; ,꼁J>m5Z')Kh*z|'PNƒӇZ ;q̐s GD[ShX[ak M/OǞ$w+y_]^\_4(1VOOn1a,QX{_xJN}Lj2:zogoW5ZO}}ZZ[9I*8FhL,F^xN2%7lx65520"5?_~Gȼf5pNsqj81_Ď#;"JΤ xaPK4$n1BZYL-;T`,N|7ݖ%KR@S)4R&ᤐP1&5pg0N&oAV4;xb$l%:Yq U1T ܡK?ϘTKz <(VFohA\\pJrɈh4,7(+Xۿh,ąe8p>:\Ai6κ_,}JܡWp˯(\MN:x|^j[h8LO˲(]QvL_omK;|&ZeS>nUޭө}Gv A1.'n-nCX7nU6%g[MV!CzJ 
LIP*1whwBq)~bۻ!EfZAԾw;uilVR6|&M)@)hG2&'{ Qsf4좺އyQ5W 3˶4oNyhugrLڪִJǝ:iaZqU&hoMFN~w(1T ˏyӼNcESbt(ɨ)dӇ5 pȇKkp/t[dLܡ9Ѝ!"1q*w!|IJ6,mR=Tr.E0V# !-o|1b\QUN|0W xb+5t'y?I,:FJ!i|D 9ktYS2wz^Ӻ[T'I%2 z0AG f:W,zqk6t{.5X sz:h,)\<ܯ! b%[9vIa "j 9r# 0FBbÔ(<,J>8mf`=8WmG%zf8Sbp縊Բ=S-3e;O I,ܫhj t+,Jxr;/rGm2 K[)y`WƟl(i{ ZUw2&+s~2BSC.kmn^x^j*+TpiQC%.ר ^xJ98dmI;8+OAh勃CnbGX< [սN lnIX2ޓ&+8$L2K>x+}` 1E B5&{L8U` fKFqrӘ&Z/ɇ mL[OgP~x*Iͥdq+|%f/~ WX!Z fx"KR&YL(om\AI ;40s"PC44Y4HH [PJ+o}F;0敩`$}TE:*3-hi~=;^]/:p O?zI2 J 3K!#3S'{7#|BP{[)Jk=R&JRN!"0 LJhkC" zk4BTp $c*uƉH>ix2.HJeT8Mi-98Y&Gc]S<_;iggsq±9zg_9ȧF/ ()h׸X6e "I~B0gLkfw.nkx/t il* 9gok'3< n=暲-88Z>P[c%aYcA4ŊnFj~D75A&\>^\^]o]a`btnֲkR5 LȭsN%pf3Ϭ,a#mCnrcYdY˗%lCBF &&/H7> \CAk?,{Znp'#\΅6pv?/,~YTȆq`[{ ʲF H3J4KJĖlCl%0b(ͲY.xpR+ed:؈^5=Oo24ð>0OG=루؀Wn1a T_$7* Bf|dkSN烈3'cwܹ@Fgx2>^oFJ%d FʮSݤ/?wCxE!ˀ{JңPӗC5rp+qfNﲿ$oR Ad?zmb>%m<=u䒍y Z%.V_-{[e`DlR/Lj|2u({;r)NJbIe!tå'Wb tRFPϧ]=}S\<\ѕШFD{*{O74Gb B*}*܋%|J C[AuqrȮ ~glLpッ3if;*8\ݚ& k 3`V 3 h j9XC<ΐE5xJrΦ܎!ҚV6l/Qͦlaɜ:R7yn]K 4y/::6b92R*2q:TleϕBjhVU3|$$lzBGVY(=Oۜ?ą:$]\5[9$I8qrXse#U i~mY"DIH `Úek\XMrP&ZtBȚAvf(]+%SB ApBcI|8S>בSK)V!Q5>\^ ٞ 1 1JrEIS75C1i|⡑2`d&X'AVhKx-yX JGկBv筛<_s7XĠ~C G5y%ىonjQ=?3չ>Sr%-y߮ZGOu[n̮Շӆx1ۺp{|oos[1>M\;[ҥ|Y1рˌ>߭ku}׵L+S !21zDaC*qRb{|O??.1~uK㵟:q`p (Qx;.Fw7.B7_UuMK` ˱eY3CPvs/43~c+Ш+xQM"\Ǻr^tbuP AU=7*4zNAbR="ٻj?m%<.= yNxs{Nx]Hf&m"P#sHFX4OȊ &$Xվ*}JfʋӅRh:]ZL1Ȳ[u>S;cX2R: Ed#/lx 2/c$5{A־Jq2d;yM̧?uQ "(%#ْF%2CdFMB#TP悏ŌZXCVآ1Zq)RVFr"k 7{Y/=Ylq:9WY BȚSVg,iTB.%ZjO9.(1{YMa oWd4!VXqƴƦa;%llyQH E`06t(E,a/c#ωd]@.*9mlwv5 lq-dM8WdkWF;0*#u40&ע:)1r8MӭG=~)5MtuC#b Y.#45}mv2?RԽw&}X5+If_%HdPH:Kb8h4F+R*7떛7=曫Tks[;3?IҜed^#z:`j_RVnR),KR!RHb,hDfg<lOR\=JB:`Ywg|u`D4vStq='kzӹhY|ZAK8I|~6 9hx`N`$"LhH/gQ!Ck(vl,'T74 #4p[AQ1/T`$Ӡ{㬗f ĎUmXU٪>VmWզtU-Cf;kktXϏ;b[hT`a %k%U;WSA5ġt}Щ:x{еet%\;p?3c' $qWQ{;_DE{N6 cbH^&X*@D³TZ`r9{|܎FA}HKL#&r u}ǻvы /(%@|9P1K*PsH0@AR W ,,sj+ D{^񑺉3-V0{_:_j^xE[%}f}RWu4%s 0!v?sez/@d [l/J$v^C/ߠ *4^Bz)gxuA]VʾTK13khxaܷPnY$Q1[ `Hi\aV_Ih*-IHJ%-S.u2δ6WBxѻRE X+ԂQ.:d;`$BgI2!zdJHLM 6BV¨k!^BNiR| Xw:8 Xٔ! CC_. 
]w]4w!RNicCs܌uᢏɐɖ=}Ea" /[gC:V]mtyy>׫rQu\ 8_XM퍍dY{ UF^I>)b/A_STY d!Dl"jšwRbb:mnehjC̻W vn!,䕛M6f&An8MmzKk3:y7b-r)8P(znom[gF֥͟ѻWn۔ юa/[ V.`8on^͒_}FFΗԚf ϣ12_fF~2H"h1߾LtQBT/V6X`T_Ч e10_F%`; cA~{RC`#h@/j%m:w"5G!Ⱦ~pdʄ^]NIvn{savnSZLǞXHr{O [3npͭjaV J00!ӒpeHEKA( Ӣ(Đb3 "+`*JX DF"0;@ ;M%3_Z9.\œhE*")ae+3, #Zd }a#GR(TE A2li*ڷYNТG(0aH9QywuGLt<(ఙ1`#g cF̼W耖+/ =W xH+ tĕ߃1"=i+:*tivS!;01hbtl 5 ¢c 1BOȃO>%}%#B铉cGИ2LԄ RGOȾ G&SD%맜zT9b"LM j,KRRH*lBo%*':T@E ˁ^q$9,g^6+DJ3C}7ÈR@ԉDIT;%jfv׋zyNaynLV= wzђYb#evI9\-饄*H>l%Qs#B +;_@00Jʓ4G(!+ H~6g! EBP_Oh1mAE频#m$@LK*94/V^0=o1Cf &^!{N*|SX ߴ3!L 9͘y8^S .P[y?e_G-ERLp]~zl5IE>NW?͙jxX|1)C< h{\!K3& J2 \0F`)B)yFTXpESY \36ǁ^T\YBrAQ*0IکLګ^ucX(9[4&KEv .(+[ S?z25qZ=r:sFѧEnfl:ӓQ9e29힆Tk\_{P->0c^ 6I1d=B"+AUƀ_[h k;6 3jOPȷ@no A?6 /#x>lqk|9PLo B"bUL).> @d-νox3~[<@} "}*;+5V3se6~W}EöjCęI   Z@㹇^ mIYDgWɓA"-⹌.N Ss0(z2HI<3(zal^:Ar"]02k>OFG=m~|(T:.:k\nf+욨 [_%TXl*lV {A[F̻ާjKM$]4+/t2F3ts0e՝{S(}kH N'ݘ p@Ad;#ćhځ>6>y(E!Z!_+<@G{w *"?0 8.!&g>Ӄ61Gm1S"N?uf[ gO!=[jXwtqX#,{CjƩW*HaT~_&̧o-FW~`a>ٹƙ Z}}oѻ|,Vt|WWYˁ[U>:A+X>~WŬ%N_ox([;TN 9ج60% *p3+5^.Bӓ={VPµvse=oUwm̢V POZ#?1ރ^nR֍m|L_IJ]|Ŕe%iB2I!#\NM}nl;?l/\4za@P|P_vΙ|7-R0/_?82$-?Wןt]6ѩr;υ4:.zf[ ]rZ2$-JP"T@ $Ӿ!U0%K%RKvP ]8%鸬5+$TD&Df49*d < 7rJ y%^TJ,\qPPgA)eX^RR 7KtW)BczTfP`2,"!hreT[<ʜRe"2-QMr>c#-$,ycFHdmHe5uɘMB{W3& ;uǖ1k|q!?$Va1c8Ҍg:PeSsqYIi Gs=t,&t6ҦW1vEܣowuA9o'pZ(Y2p h:q9Fnd (Cq'5P-5%:2Ьdõ%$;o~qe;xYD .OiN@icȐs A{Lf9ٗt܀޾yxbVe`+}>P_طy!- |rkO+n\ÔD!'Kpg0" #R%z Y  6蹛z!1p4.OljcS&k~sE6уFBxG}mF o\z=ݜ%Dn7";BAۍkWO4,FB&5t&}wt%/1gLgVQt3tdm\ vGNO:>ܟDxOɝmF x!V]gmi]s}}::073/o?/~҇4?{W۸EE[D%|HtZ &Eghbz4 ?RrlٖmZl@'MSpѝ6wcfP*֚q R Ԥ$jdf`ZAα loKzRmw$LR-e+nU)@ 4q6 ŀ ]HE +P<#ʁ/rŷ>"QI"]zm"5'3iWoYֲAeB &epy}{!3]\*` &_`е{cd&]nhw?C=&eƧ>#V=GkZZL5CWunq/q~Nn^W;B %Tox..{nrQM'a-u?xN)IaJCT=L|9U LW U.*6hX;"D7[ZuS/[+He;0t CC i+ k&&R=%~Ć֎œO@A.Vapd8yV;OP'hwoƒw%bZF6g.M9Pq7 Ew6$bv`kIE?VVzOAz!wC ֋E!e r[;k1(FYEYbmQ )m!U@sHa~k2m #,kUf%𘶵ʾ2-ubu.O~NUѤLS^&/ rj.F[: >sG F:T)2 Jz)5.69p+c؝H%tQY}m,]6K^tK0zrL ޥX 9Kn<ֶҐaFJ_o8/. 
gE/ka.N#񕢌ib#g9Jl̓AeBq3:Q(j1k G wZtyVAB]Z (=׬uF8^5T!jc?E'5խP(5f6|a8Y}~vATĥ` R^amDav$.gx<0c+M @W|zaXEN..xm"#"XWʞCtPЪѳh<}`&>7΍, FuK'<ĵ=hA;fL?[tn/) ᓛ} ba#K=3ARIb[R_gU&&,Np4C U.~T(7Eqk|ڨC{I\$p\l"v-}j -KK Yޱ~,!8Iɮ'I?q?dl;ʡ,%[Wl|9$({|49t5zPd0Sv>;PD͒w6QߓM74heiIVPyKFª3vܪlƒ|U E `gޠ"lM}[RhۛZ H>ysH!7 L>2|v "Eb\Az !X!6 ayG=hV{ᵃe>P(,|hPKzر u%w6uߥ\“,%߽}9=:B3NO^E'"=P\mata $P<NI,`w/G^t$IAe9/,[G m_?=뢭<|am*8vQw dcK¥$t unѷ!(X(fLZ6gk %w M %ola!^nK:9*dBzu\}DR~2f{I`:+sc潲 =y[Z47 5#f-ƯlFWK~t܈ y~;#3mcV6m9G-d VQ8` 9l s(3vGy*&U/6 4y3zgY9vz' Oo^VEVڒW^J[j)Lb,)!{6 ,%3=wlQي"aU_c2O%_^rAhJM{O˧ffŬү8,vDFutJsbl?T쉺G^Cơ?Gs;l5z>@Ԋmͅ Y+k}lK̦D gn~ H6롨pT(HQ͆b( AnoTHX~HvT*.H/l}8_'T'_,a_܃2vb icgVѼ?{7W/=v=|nհٸ!nlgkuSSd-u]9n9&~Ki;iE:Tqqw0:s~)ζ՛A QwG_Fs3cFќimY]; !'-e-ڐ:uGAsX[sF.ha"a#%984dqMߑ"p/L/ c`&܍&g_ 9VswZxՏ]-w׉8v||<YndQzT@\51\Ɖѷ?MMS3<?֜^~0~$|;=Tv2?7WWg?8=m9#Yw^s1~??>ꏳ?vAŞ˨(s@rڡdP.wv5=Jx1Gh 7/QmFB*ۻuc LrTNxuRFN{>=.'h҉ggt Oj"@X6a}fZ(s&-k禛hX$]tvmb@ퟖ*2 {'Z))j@0w=LWĪ2T ߎtbOK(di&]}>^v/zӵΦGXxO= YZQg%;mWUTHF'5[%Zz.r_gWWvsY6h?Qy1W蟛)k8ꝅ5?D~>#R^2I?g=)Y:yeڝ>k$݈{z&%m}1gYɴowCx>h`|;<̧hCl w11צob^n T%$dz4D{x|QEw*b݋nO=Lf)0BTwYWB !&9ǾpjI;m[fwS%%QrhF'.ć10ZEh p:q#`ڑ?(?EL׋tS'~m[s|*@L1-9=Z׋,pP@Qy Ll>%˲&&㪸NOr7o5GIxӧ :9؈xSe/kQG!B̷R?p(h"qz \<|zdkmbSW9)\/ls„"&8vCDR_I-[{\1mlЍ>$X~tz5 N_ow:C4M526(-ʤE]o1IZ.7N&-J/S:M "ZLHC64Pq{>z?ϑZUO*Q[DՆdK9sWK H|9O{)#עP^rmť]j1!I|WXJNzeF[*H%~ސP8 (f4ߒ.ܽQDŽRAoGt$0I^z+?ͮBk-:mdYqJ]Ο[%[i3 B$.t@$20SjG'20pXEWC QNzmp`-t)cV${v5}Pᇏ`F5.hB^zoׇ1>7KLNLhkkPʤ`"xjRhNzePR<,P%*np<4:i' YųHgϢx@ti:2֧*zʱHʱHʱHʱ* h&ztLG3u~|'%/^1=YJ~wSs6>?(^O%/J%FWM9e%Z듀[dKcB0o>R,Ǹ1 "ܛ>妆 F*szFuD%wIbXbJZXvIYfhKGIG!.@qYqLY16535 @%> xn>{?)WJ1XkL~&S4\* |zʶ~RL"+*m]ݿ)4E 5?Rx ZR7%^ߛyauy(ct}2].6%便 eʰnr5{Z1"rYB*9nKYoϮW_Uppk9Uv b9ĘI*;gh4ģWƴTcDgp@UD\0rƣ ʔaB $Z_._Ə2r&M?)dQٖY@@VL37s2'ҴϷmAn܂"b^mXuZOv"$#m|b(GRJ֋ 4ɜ,0V"j *~:ywQ#&7o"%#gZ] AS *Xs%u1H*uxEwt=/-cNQJcMWð38t7jh+}!`Нezߊ^`5cUޏ367rTIȻ|3yJ (( ՍD!ª\jsR2P6L O2W &_ocB.TA_\\,ϳY/.h+FJl,5s㰣:0k)?L>Ds8G!O MJX쏬kH9DmسVl1aO+{*OMY^_vR2) xMpp}Ԯ&)#t]n{' 1glW o! 
5#<,YdN  We,RYpTZb m|~HjAv|WC|ɸ90@[IU ( \ $Z(0ΰ6ZG %awG ܌%hBTˑyK9w*2Sa$T+*:rF0f)XDu)x[< Kr(P'S֗xt8W6a]K p.ivȡa.Hj)4 i(E~ jjбu1pod1V@ 2ֆ ('1 M#!pHy?1)ހRonTXqvt`+>kiv[EKYW/q]v+)^YPI;IYk16[bC"'! 3^ME-TS~֌jS8)&ْ%k:#7:iϝ*\-??Ӂ ]Im&  lSd)#5x%NI5+< ]igB?$p_|~'wьI:\T+Cͬ_ɫVndüKnaStҦ⿵kqN.u>:^/xX'?\vC^coM:2RMFմV 33Z1R}E0䓻ʟg:!lirR`O鼱(AOBC0iB"V)׷i}d.D:4^M_{yЫcdڻu8~1a*=V~tjgӨ?&j*LGTPe{;|Bg?ow=/ǠQr0-xGR댆q5(cR NGQ]dK<VeE}m춳Q= Fs'(3c 0贰 ke*cw`T: 5Zu"9ƠRRY)Lɉ.U)|-l.5p!89LjaX!X=C8+T="G C(zCztBҵ`-`)5k\T̲7~CZ讫)]+2ފ|&5BΠyA.bz#g-sij.-b"65eGS{,R뀓:l{Hƥ-24ėymoBEMΆ5C0adp)=`TAYI+*252$L B)Rc8DV4zFzXk|r/}qlا ۻ3]{ak-ov^ά3!X2&i51p#Kj J9:Uऀh'RD.I˦-B P:1kU "䦜wS?357K*^#5WLSCUxZRd̥& ٩ɑJ*6SiJr 4d:KndIRAq{IRQ>n'ǭ(Uxh)|O靔WI _|81&cB [mX]T4L=վЍL* NxqN,&8e{Amu19'.i.5>V(zH@$1 ~-2mcr|QC7C[D􅈒`,<-ti苇;ѫag   |:x?EgR JJϝLFCج{4)ZrotN\ҖOc~>"A M<?oVdoRߟ7)/y>Gyђ|m~!kq~peCs.pN ^/)d^V;X~Q2Hdjp˯3$&_g `hϯXh fS鵪&];K (3=j&8̾+ ],r'2ـ1Tb]_/_tJWbj F 'kZPRR RŒRe5S[[i,J /h1 ^%|rS4ӏ Y#+bÀ?l"%8_r_)D!U)jH΃#j8SztE`7JqګJqګjvu7L`>i lC '!*"VK'QZͩ J9>VWWѾqr7N}ЫrR݈!{- DeQ>s*-ϗojHnc6I4&4rE+C=FB.H#<׫d_}Jj_A0a=#+ 3)/`P=KaR gkj>h1\0:R6/Rs/P"[bd98kCE5괗I-V! Mv9yY ^^[k\u>S5&#yaBl9QbqNIgDi|M^% f-zh:AУ&M5YH9&@ Fe'Z)5ÀLWHj&ʪKHsDkGT.ȟoyhB?N9𝼉8Iʝ=\(x[h"ΊKkn ևr-KRa{iP1i }YBTcJ:i5O)[D"D|&ǜ?ɉT;Vc(4j(Q 7ޚOiԙҘtFhw}g.F1)P1cD ah Ec'$%Ȍ5ܽ0}xA}.P؈Cdi0B\cڰSFewFP TW]u618\SR޽R_iǔʀJ07Be5`19a!mRD)qpρJ:8 ܺKۘP^8.": +q@5 %PHdj$& B&+IpJkew@j%g'#u72S7ת)Ȑts.1nr9prg<,@@2A6LX^oHm*/W-ZVZo[PIG$́}e(Φ )Y_l}qyA~TzނpzXCr;sOP83vE+@.+!f=RI;r A7 =>F [c |(X00ܹ"`TqSyca!4+-9.nDQ%yoQ'\܂~ @~[733ߍ)o4qo@|]{R.kSIdZ: ٿ9ᜌb Y4)Ipjf8BM#)f;*57} 2èf|JJg;KNXr D>Xׄc)%)Pr@ܔT%G \{҃=iγppL\>xwJrSC.$;|ΜBl <`fo2""̖ܒal]! Xw_ǣoGxT.>Zy+稄:tNj:2M}yj,=mHՆ??~r[&ˎTt㰅\DQ  Bux1l&,W($ӘX}`y=a(3F,9a7W NQ92R$K`__\~6˛\=2apsճKe*$$uVIV'APmzyilnh) i札҈pPǞ#Z'~Hsb ."պ`ؚBaAU>_39 H$]1@I 0,)4\R0V5U=y[L5Dix, ZjF`R umG/7oH |#e9X'=Y,t,( g%n~[x:?rhE,<3"hyF( E{C,Y8?7VH=! #Ҋ!d'a>ol%4M{q֐+ݣ89&SO9?ө9q; 7i\s*c[e.ޗ1t\qS q˫e=]e9 6՜'oeTn&Uo:hթݻe~ 0z=?ʄs?Í_RgSCk >\2~vn).?h ts? d8)#z8Z"ަswA)2>Ҳa~V.CIȟ\D);3cc{ع. 
X1TSJV5WuU7vWp@`߸PBՂ;w(Zj5>Ѿ9g|ҝ"ӊ9nF2yWC,TRfA 1' Gahox0辑BЀF^FHPiTs9wSO3_h8CrrL19QiriA?#|yU(Uri䉥Tpc cY16}}yLh8{|h([.d@r"*:~M5ŕ 30)lnilBB"zLiB*)3Ip5Y}gp1{#gΉ3ϏW ιGͧ׾|BR\!q~w O"+&wgw*W8| X`~đEG4o|`a¤dW*5w܍9R :k6S׊O߶!Nc:lh8'do޾mD&ۡLUzխ3˱ ubܤf铍I O y"yLEp{\V r7?>,Ff>\ߥ!Zq>_ )+8K[>hԖCѾzh8Fl"Z۱rq:72om:SY OXZbrꍒL;Cݽ /-PWW0o[֨PI{9ppc{ߗeԦKDt~_KS'H\U=xMýKc'|8M:?$f7垾I˝^OgEUCRF͌ئ_)#eSF L)G]\KV6/3P"h0w}v6 -O:D,:U-uӞ*+ ??ӌyiOlKvrp<(3(a[Vy6 گq bA 8׈j!3ǒ)Nrk@"RЎʳ?EVN LI3Ѝ!Hn֫gG>ߔ6;sLsM ^$xmUhcvgdo]H<oD7=݅OM(ۅgpm>܈vD/񺓝0sǕTp;n%\.T&߱gm.ɤ\UJDuV3VP ,d)ULS Ў[H F ʿ:D%cڑ fDp@^be B{Z2"EJE$-U<\6lmb;iI~z(!Z}SBs->+LsVe,Cs!LU ˟ ohϖV9/hO I4ڙݿ\3 coz:Y۲϶.MM{1itrl+v=&hq5ˣ9q^Kmto]s&wDb.@ `;ʼn=y+|B;*kC&2 /Ç9:v z86Fw|cA ~ޯ/'wN.ֶݼ1ّ0ʰ|p9gI/GJ$95tf~p󛋻p Hw_lp_z`!j~ \z8|Um r#4o!%7eʀ^\egyA NM4Cd:isC7>yz 2=AGZahO:7__|X,hVeF;4G_ c&dƒL`"9xЌf?⻰_7ˀ~}wv1YII xV)q ~]xe~]SK$g4)z,rb{kW*ǜʍq8TrT0/u3Z0-i0KX%aϷKj6ۿěw\\?w_ v=}?-jK=@Dnrg웥]}c8PEyzmQL5YؾlڦYIX_U90?΋uvG¹jm#ȴ)g<T(iu&<3<.)/׷N /g ' 'P)e"DYScy0hlnшX  t8H")P 1"fa|Y}q%D`㾃T)yJH[,9؇i2C=c.grH&MgܐH Y_Q0gFpjVJ2_Fr4e;efD w$Ļy{%Ru\#L W >FRlbr [8rˢpj ͍}L(\5~EKT{"s +&j%0 9$u5'1^%ѢNp: $0xFjWNQu~ Uli֤Zd^njaHܻ֝GޮWk|_Xu4EP!9q%;?wTOsw$#4nܔ+M-)bNALn#-I[PcTOlW\:^jY&r)2 :96D|MvG9Δf %(Us9{~gfCo}W͔#UXwˋmå&D!cO/'Y~O^NM\`%ֻcd9.0[|c -|n+'It3CtJj볳=WqU\uL/ *|NE46]pGAZsi%؜P!ɬ`]_ԉO-2:l P h}2#N' Ci5gάJm~m>ĻG4iNt DokikOo<«mG{=5Ѩ$@9-ytpEGN#LKk3*-ŁXG)BI3#%䍷82GqΞx_*L>v=L=qTt W&o-6/3qaU[!ha}&QeyR<^%WNS޿=ЏJ1|8x . Usנ ʇ5 y~#"2>#SY_u;yǃnRڨ){3 gRP%ҊFc"57Lj-?±z'&nj)P3ȫlmNE;viSio 9uU ͵ X,Q 2M&&Ϸ.m%=|Ž\)k!4 >T@=_H̓|ӃOz1iQ"cZ{I6NJ6hBTJt(U+6>" sD:쟞npNIQݳN'yC\Gx! =W_ZZ@nE ʢͩ TLnJ+a':10\ OZstmtA}gG$+fhY/H)jasф+#f ^:G$ӹeS2X8wgAe?) 
{%AbBW0f_"39K]DlrQ¬2N)5B J!e -(DL0 B|(jV08&iiWpP趰 GBuL UR,P*\1۲3om#[)Hoa ik\q=xr8J|9PܯW{ЩDT<RZci${+N:A,:3r$>ElvRkAFE gф{Жso/<C;]h II@^˼$@>pF,G?Ga]c=Si0^U _ jnS @RH&0 hki.Cc# K!>Q9vqw׏D,1$Z[He0c ^ʰBq[E`pFzU+(ZBl>=: 㼮p֐̄q'ݧq K'2N4SlJ]~Ɖ|J}G:B=|!NyX$5&XDR-1uB `n.(YMDߢu|F1TJA -TBM'` f9vl0@%dB i8!8f3 M0b@dBH.e|Τ"f|Fq3̓.D> "չQ;ʾ3\{kOdd&0l~e8i${HIg:(r +ơ 9xVhry +,Nӥ~{ rA@7n tܱg2Y}OXf9~?-ٮ0x?c$]ADnr㛥]}#8>E"mQL5YMQvmtdU^U'Om .\f10!iD g\mitR^:m=J?G@ڋRNYRNir>FHcX1Y#v̇e$Z3\`d7)D$cԐ+}2bO33ʂ$5DV a*C%5,5}2-f$=CC_1nvN~ V z.PJJe(o?Mu+|}l׿r nR,o'.#frM/6A>)@?{ƍ?)҇6uINSb"Z~!% ߘ'G'V9E_7 twk(mx~4 ]3?oBȵ<ѹ/f_Gq1;F4& iD]-5G|nWb$k&#bî{R#bF2ՈJGVP2X9w7o0*J{~,Qw( ~麣~2K/9z ~q@T`gd0n/1}LVֿ67ËH`|<ߌ'tuTOQ=] LGa]٣xÄC3ǴaY1#y$e;*c?ye%Q-mw_@&d{UZi8k03~vk?5mrԌwl2H }3p}XN!>4S9_Q5Sh\]6P__w86 /pHW%w+{{yspw8{OPqsA.. iLw7`ZPJԏwv= G},^^/yǿ|{u˿~O_ y0vr;Y|wX ~1|X57}<yEXdpÎi ?\o}(5x+9svye8sfpLԥF)/4`%`3J( D8fRL&[R,%+{HVAX|0'CDBEf|@xr섲&NXꦍ )\y4Tymk_˘Zd5JŌzVv Tcv_$LB:ذl0`^VhYV.6K'gɓ#hv!lsא#[.ӠO! ä2 MnB7R&G HSMa>ڂ(͹$X- ;Cf[# V7#}"W0b@F +10F{/Ȟ-,dD:ﲵysv]|mq&!ܶlQgm -_ZkC9 Y'!Om!M7fp-M9tcF xsTqcqӻ>9CseMM`Yh2Y#:Ufsė8ަwYxal @^#O#n/oua(AfB[aej<k4NZI @/2SifԚۜVWѶtȹt~3Gr,w6FSմx>&SJ;3A>~?V~ikV$hHR'ɕcƙ(V,%uF+;AR`"Y4I[ *Ee8XɆթfeasUa?̍X,0 VGW^;.1Erta즏Y~ 캬`}ӒuAJA9zM<t<ݖD] :Ku>}5h,\}kcgMjKYRܣ?E IZ2U({RKvP*ڭ.)Sw2ڭCnIZȶ[MnMHZ2UDŹSڭ&ISsR^vhvk@B޸V). L љ}7:_^Wmo*Zw"QNo132RĨC4x.200a+F!Sc0/yW5$ƀ:8]I4H )n3m$D|^ի֏׷f X(t,j {\fE>_w^RHE}2'SE<73G0wC!:zy-<3@?lXAFH1fy* J5VzdX 4B8F\#9LWh_lUl=A B_ݼO~;LFab >,V3璊Lk~L:kCk= D1RMaEe:5h(櫸F c}0_%47 \0W4w/)=6ߊMQ}1^o៣i~`,=oޫɞ|qv_ks9̈YFJÏ58<Y]eNt߉g L),Uu6P68!daQ&g 6(J$kթf!Z1 @Y̡  -SC!aqؗZX `gܸCF &Ɉ8NgN)!"6 H5Qq =vM|cXwHHi770(큔jj&09K髖RҤq,R_]qY's,\;#@KQkLVj0@LX3QhX&ւL=m7.G^PMFn F7i'QQ"A#x .SΐÑ滑KBmF(O5aJ+<3u 䎇ݐ? 
N1!6'lnCD)=4c!.KP kޝ۔0JM%e6MCﳘP@4(a`=Rd/hZEDe$) ~HARi# QY$s9+ b֫ ĤR/%>eWTt֧[UƔ+R(![q,W/}c9(9B'I8sk#Ge⼐5R(+S-fTVj)a[0˅Ri7J Պ'Az/VHq"dxHrzJ5 '|w&TkkVi*q?ie?l\ۘc7WTYgp,iJ3:2ԋȟ$9B_WL9ݿ+@K^M* L0V]%ΪAgհg^C^+O+O#X%:H.&iVbK_C_L}P0Dqw*S{])[4R`$ 9C,\ӷ>t\LZ]< >b4!n߸i1NZ:"y|A[qqFJfv:Ew;gn2NQe͝;gN*r5QF &רW/l ϛ/%&>#KtH&YV!lg18)x&5ccċZ sD]*H@-B @꘿U\8p}j3|Bq&7Ⱥ;mS0:;SmR?g@PxON)[4iN[G!Mk1{<ʸv8rR<]Hކ},r]uF w16ƇO?,w?}5pتk^;oU``Dѣ s6좝W"R\q^p\5xˬ9}Gk1'(ELfH+Lj5{Q+$ici. *pccN-V[;J%&N;Cd=:*A^3JJa-b& mA:jL.czH_!2;STއ?ښcw@CҘeыY"#KÐ-EDFDfd`Xjff"Fwxl mhY6qo2GovJ tͷ[sgL\Oqb\(dw;seу7Wwm{baHj{DՉzZv*t;옅"![{Lnl!>yV= KDR3}YS߉V|K.o%ڎ,# L{d(gK$, kŜwd"4Q/Ҕ~ӔKPhEK;FT=d۟ws!t_'K7=V~J 0C!+-b@bHMi-x)/t0 զV$;k;Җ$_:-H$YHSpWXs B;t۝*Lv:F]XA"f60! -PF\0~ڪ8-\68_63[+qD"Q4`mE^:tA6{ ]W) E.A[>iW} B[FqՑlN"uUOeDv=bE5kOdk%;vNb/ּXVݵz76|Uyv G?~颓\D>͵Z<ȟb3|<0ƴV.S䷈bra6~ug16Ny<2(6PTg{_|pSYgysKQZ[Qͨu!4@G>{4%R j0,͠nsSSX0 O;(KGiCOZVO[55B6F%ם7AD!MY)HS$Yi5]Yt$s_ 81L@Y,A^j,{%23.=\D e{ G>-zIԗ|E]։>]cE8X. gAR0@1J'&Fg*C }a4 2F\hb^.FTK*%AVxz,- ^ Ac)%,UH0t/=G ˯'?j8@ _F 1<.k3g Y07Y%'NDlmߌ:![ C[4|5Gf/{Bҡ+\I1?e7aR,<5Xa eY hf-z$9fB4| 5SDK*弴yZ[ TTz{WߍZ:{o).̹y ;AfADCȠ*Q-jN-6HyKc a`w 䝹7,cX,~v^In,ѤтwJt؎I{b%71m@jhsj16Z!m*W+R&k|fMbC"`?՝B5wl-8kEsjv-24uma3G3.Mui0BIދW})^ՒS- b+" xEo5vTÊx\ۤ5-$32?}|tIm l9+P=!XxI-mKw2'kVFq~^de c% eoY%Jn$tqlY{p+04<|MQxwN\aLvBbSuS2G+i!#u؝flRZ56Ǥ~Кqb }L0 W0E&(]WkbW3׳jĢWWLUZd?XmQ>@ʁ_A<8\2'q>'fXaR,JDBɭ-qQ\xYZS*#Qml\^}՛֐;7w5E^C B( m`TM}UɍkU\@AD**>_.wRe{e64 c b [j  I#ъ" 'k[`+m2Sj_2cʚW [#fAJm%\q )^*MLO+] zw׎,l ô -P5DB!F,#&DgIV|,v#iQFΰ L˸K̈$Ҏb Ք DP$ 7:5'ՏR}->Х ?*œuVBI: s #[>s+jHg!…rj Ko9FB%(ۃL%(i,AIz85A+AI(GqNa$ en""Zh]Ne"+OֵK甽*F}RhYJ'5+q iQK)&Rd:GmIT/ x*Ѯ`Z9Sox5l]zv/2XIb"bڅ , !5^3D4,P83eybly"JT}a.H3qpCX bT#%@<0DrPnU Hg f/ NUS(en?\Qaq!ng6վxw?>rqE RUǗsIGҭOmh֜6ˮc.RJ2/s,+)X=ׅ}j!aچnq+0W̩@4 _`}, /Q `cVFX_zpY?9ce_A ?;*,Gڀ.[;]߉S(Я7~'=3\݂KKpN424P >K&{{쟡/| ݘ]vfݞtW5/Ij-AS\aMQ*1<"M(, HZvdZujR޳ nCowBׁ%Ii;^P ׆䵠ڰG/N$*ȋLN {Rc;{?~ZKsw>9`O|RZhWï$a${WTO4;'.5#iY ;Cs"uS0}@ ~8tG2N&jy- )T4iፈyKcUTskn |rO<:GXc]Ybک}p hGY-WPN]X fzc},Zo҆F5vNi>WU)a 
M(A+}m&T-`B}1\W|s4.@Rp*( RRENS˜,J}lI6V02H+GMaU9j5:w-.LQm0/ 4B'$(32B&8w. }<{x8SX`!~ SFFXdf^2PZ #!|, R3V`Է$e D-y%CFBObNh%0'>l`>e/1O! ,(Z蒊Bbf1:AWWTv۩r:0)c_i7) ]O\s#σVѮ/bQ,_-TyxS,js4kjYԴUYԤWJMvy#*v΅, oITzH{2YsыGw.%4(D7GU==RS]]o]'dm.n|8b2cAh8;$շ]&!&E_w{|iCyܸPQh+@::-Hǘ`NkU%Jve)` P(i"4<*zT0 ϕ\"aC2a;3WKJZ2o LqQPyh!8^;s8Hǥ|9B|3:\V6&7XaV0K#FT]T'.ElB;x) y\ sۍ/-;'sMq" 7pJu'wFbzxƊ={sժ>^T5!a&I>yN&m1< +$Ũfn")yyEff1O*wZ./vU##"ӮwTP$C>^?͡՛坽) f  p0yT aIX ZCG=Z@ 1pZӐ %Ԓ` OCab!77IE09iPWCap9S+ODcFXb6ҥjN)jו/ W>ua;bZ N'5 7ɥFi#ڐuӐ֓&j)S {^5PPnMmH RKI?4ZhxQ0hfpߖ3 ,. AMQ1 6jBiɏaɋ>ih`:ӬCvO찎aKN>ul6o5nBnߦiWgCj/>^!.oZX햨׽Ho]z3:3Oְg7}ֹhrw$b2cκuT{^t~<_ T?tpK}u]~%zKo"2n@w-nuIS=UaaCԖ'zu|ual6̆dN";wv-eMj/NN `8;*B!@iu9mTOqs jً r!S WMg^+ZM{WL ְ-^RbGN_l!w  BXkY~c 5>}RMwA?Ϸ]?v|~ތ/>HDp+5bv4+`ivIѹdu1O5lsPneNs_A>\AW.L]M?Vq'ѸO]nVeV$#oPE1{^ح]|Bl'.ϯ޷VéAݖHf6^FxK.Q=5%s=uK>H>ulUj)RfK)JfH)^s`Z"4oSH&4!ۊۂb_]-b z_s1)rm,:FږkR(CV{139Vİ lٳ5܎5k$]h+>5JJ/s.M?^BrSSzApN60}q?~ O_ogB#|ׁT< 맑470黇~aL Ͽ>8Ayrh]d˫>:^0 =󛃳f驎'xy{o<:A냻zއpƫxpa~jxbwZ<'sSYwvە&_ne1HĂ=A\}_JvX9oIΘVEK@;oy˺8st1 )Zp|5_fKC<j_xa &Zr+:]9hv[Ƿ:]WOK)mTB(Rv%b{åmE߯R a^9YW|wx]Gcx.Ϊo!SQ9ǟ~zsՆ_oC&${_gˈ&nS3V k(RЅD ZkJRSkGccՖ ܳqp Wٹ kgˎj~f* e,XS"c ( e@+Fc1|89] !?0w)GJcn0w3wɒ ԁum Sv@F!l42V}߭L} 4 4{회nh0 .~ObeR{S'{AGtxtuT/ڸ_@Y&WrI[#pI+ןݖFRnGRyu[I¹ [W l#p`ǥi4_\=!tM$q>7c@kPjE\#S۬bf!Y{{ɔ+{tF:: pjQܸ}P/b 1fj #UxfLL66jX<5}4wuG2W#Yi駖/~g'\OqCơ XRđwƜ][3smh+JvK-F|;/݈/pD0҄_C0tiZJRϖD#A U&-f*Uet6u'ASS Շg]oT޳6rW~ rvC^IF6|aO-mIJ:!%)R )b ZQÞzuuUu=8;{h [Qh-;S)#@V+٨ɻsQ%? 
m^`?:Q}߶r@U?+BgBʇmu; xۘ-THXJ-?nCφЅFAoh$^Q  : Fۖl>C:gӎrJYn2:jFIdbJR3 n&i,>u`JrxXH9[55&ŕ"h t)s8ԜJ j TDK٦RV'aB#;uhkyBbSt@;%@ m*VP2I{ꕌ48F^=1\%n Ģh-%$VP+Z$'CNEmrR%49$3PKgP;P;".3CW4y⁴ޙZ<)@{֦j֬QГ+(?ūLyUT{4.=t#۩n?vv7cO#@r>J|FᴐH0/ !HOK(N ^/ͽ͋()a f@g]x,8wFve\3;2j̨ŋ1Z},zpٛ7 K,<[$/tNqbf'i/-rG+|܁w,G# ^"i)'+}܋HrtŎ>PY[һ~bU[vLxWƻ~c?]ۆ[Kz|q1sq7=ch.e'Um튱GP~#hDyyaz\ۻ۷;>:[F@tⓛGh%)\)ݝ})HlTВ|;9%7<j)gء']:@$Fm,h/N!67|ԉ 8A\Zϗ|UhyN\.ևS^l3L]lmyd|fq}8??.goU#V#8("oT c;,`Ft3Fg}rVWrWy"1ռf}ޝ[5A;IXM/WxWVk<Ҝ9FQ.Ճ+{LK 93<k6y yk^h^"2D ^BWki6AFt.':E#!kR0I$7 lp ΧP焓/,duąESBF9)UQ)(fD9Džep cà` ދ^c-M@0LL[KtԢp k]:-7&_\nm@X\ b)ЌTBӾmhLJ*NrwbyFY3RBiv&= TWsnCnZ#@M*Cj)Gz.@*!>@ TQ4YD/0Fa>>C]u, m c8@>X!l4Ert3'- _D~8"PgS  %IKTZPA*#OtV*Yayn+'YPqReaԂJsO+&f&ZCiB;kQT$S Ozz^WзG$UZ%EZ9eE<:>P+YAӾORƤVq,L*p)a$:~qKV؇eN-Pu'K;! FLq2^3ϖ(aG||1!="]{`Kۧ%FT'5!c%Ͼ߶vZs w(򷳽t0f@3&-s&Ѩ}ee^fK?naozY|岯WY1""LUv*yk߼5qUEu;Qv`t'v<=ocA8*ܕn l|rh5j`vs}Jf !Oygq6N=WXvzIR)*=*f|#3<0&r5Xo|Vj"A=DwK[t[1~:~oGP'׋y(m kfԨbwo'HyÝ̑a#Հ:&㔠OΈuSQ,~ޝa=3uaڨbaGRlB&5O9bCRT`R ;TޯƮgw&T)" 8X(`ǧΤ QEW8L/.'!!'^;P,>Ldd("Jpc7 Fdt# h5BCOr::;W,kVOxܸVML|24(c=sŕV,tkevGSH%#spknE]BSlDKeR(0{~Sݽi`!JVy+%R@+ %,;|p|J^D1IqC<:G@81Wělm6iŌP/6R4Rs,zKS{]32'xzZ9\bB-CY>?dД ~6OQzxoYm1[ #^_7xP7%:$Nѕ4OW KC"jl sZ㜤Sn1h/%sQ؊&pFؾX-I S6:s[X!d"ag%/DHdApFWς!C4)F$Jv)T$z22C)=4FIQQko=BE,5DZd*\ H,BZ˘cT;"I`VifQ(>j /"5Ja9BԜikP.gf 9nZ<2:ؒM$@Mr2H"jՁJ T 4y9 *TqAH aEp9͆Q Y<9Fl$Acg/^yь[x"YRR|TխF@ Gd,(D?wN.bw6y x˒@5:[;/.yDAT!~qQqɺ躾ϑ0J0(_mF31X77rH\}J7y{SmSC^r_#9Y3o#Wl9eP˛{7 [wqffoL%F|EO,2}Zu>RQ*bTZ081K*2*F84l%bs( OۑMyb&Va?8ǢӁv?GBMIͼ4%XV*ݵR^tv\z妮fCKy"K33/'$N$Zh +VZS@1^ffCs% ydnYNM=m` o_Κl,Jd\5_*\ {8ZD oԕh1r924YqլϹl՜{#v0(2f2nñ5順~mGۜ :뜁x4Y.5IwY; -7EϢY MJk;uxy5" 03X"Qd2wS[c:'?Əo/qfG՘h9M''Z==,Z6d姶 { :C5HD w'iX3gڪv{]Zgn^JI b ?;3B 1ʒUbXe}}_qxk[J򫭈N-J{nKIUoW\l;?i"LG*QzǑx&=.jAlBؓV˒zqtz]Vl.}n I@DsbJڒ2S!՘ L< -m!7Mgozf.zrWrPI' t}n)9 ]q\Ѿl NwѰ2ngo ƻ$ofoNzZ{9>O;d%+SɖRX R!mT+#PyJ f1]hmԜٱ-C%LT6*2!TS_QM]B꧗N YAʢ}l,*Cڋ&`Fbt"*P tAg,YRcQdֻNtc)eZRx 8QQIu:(KJ6:t'։srq{ʩw;+kF "Ԏl_MVC6[cX6%f@lN`C2FTytbBA)bN~t<:B9h(s>4"H([AFk0["h 
xM+mD5Fl$j!x~ 48T)ql;(?l`ٴHr{ՉL8Gv:xܳ:e܋ ܡ;Y)+BWAQ/܅nytBYuJPN Uh߭ى߅{ɛsioZIޚ#E]TXl[+}U'9{AuG}{ G\6QypkeN}HJzwy=gfZmymAwr9=bSM_UFjI%)cxa@^"{5d4=CȲ&dm39!Pt,(;&.L@ EqFP50ϭ%s(>٧z b:ݭܮ bVg纣ZσTeu%1ü%ψ ǻI{s}FO+ۂt{?hA}F@~:o/_|Apӹs{JtdZY*,+Pw4foK 05͓"k tb.)M2|03/I%"O<||eӼxlFɦ`9OT{$Xra>_Is_3'kR9[%1˘5'0db'R^[c̝fԇѓ[Cd.pr]Vb-Ϡ,t]]2S Ljޟ#θŰ?VxOwC1l@okߚW<|~2^jL zI'w-력6gF9a酝^%b,]-ZfXKfp&sӢ(Vu7mCg7:owVH{-7_7Ծ7|p{>(̡!gHs%mؾܖZfvC3O>|ZǧS˞{8IXnPo~->]CO/=YDZd ]}v8GWx`(S@GO"`fe"Zo59)o,ӷ:ޡ<"s _W?&[=Ez̽^!-]n1KsG?5r*6Ы`_}fuyi ){A[JN0L>~XIw|EnJ{/ Ro⺅/4"a ͅS3.uQIj^E.. NZ"՝DN,Pte[N!̋j-^cIު?5``~#k{PK̹#ʛ@$p>rգU6\'r!䃆l7̥Q9!z@8y6Qa7>;GyJa"5ό}v;Pznצ>`S0b5gFO\W׊E;YkR[+׳(x<,zC)qrp'@WfLk-k/4eyj]G.PD1BQek<|oYΩγ`I-=SU@it\,Ģce䉲W$lcD-? pYC+m޼rueYמfÐ!+˷geD{5Gv,FWS]S."Y^F'{J1 ,=9W^)OAa0^{z.5և6DUL-eejŴYbuӏugǀksVȘ)gAZ7fh.(p ߲V,P#dtxPu4Q!jnn~A4,YO!* /9R޵q#E16{X%H658~dHylt$ꮪY,֓G"+U&,*њI^ a.);= #{\7<jcEpIc] fEy-FqDw}ќ&1]|z'0F>|ϚDy]߽#= S\6zO-7ogKSaz9_XjyeOb-w3k:4^.k6?8{2Ez[i ww?|{?]G1~d <#:W+ΕE窉Fs;T\2PA?]̍i^ ?.k~W[W{6 -u/VǠ;'Ȋ۪d"<%*O~e&xg~\ΠC\~s{|ߟ%Zd7/^!'1ƨE3$죳F(fsY뼻R[p;#fn)։.6l,؉(hA7B־I\FZL|9>[ם~a3Ij y󯎄s>:my(|*W (AH+̗hbO=Xc-T.K'>m8`ݵv fUX!gD֔ԣ8n Pe~G y`G̨vB|)N%g\"U|r7|r7'GT3D|rZ$YOʁRC5#|ˡavܟ[+;m7Mg+`:spS訦Pu.ݝ(A:-EtjM@T 2S`6/9z#geQ s,N+R# 2 JJtJ7;,x VvэHiQ i p4*YF* iU(hIrZOP6|.Kj0*Pa7|ÝSAD 1Vș/Ѡ5nƑ* 6e0E՟v fUX!gDCm ŹTk@* 6-BFUf-QWc^SqJ-"g"Oe"H&"sLRL;;깩t}*PK6('I R.nSC4%[O>:J^lFnqavOWo1[%znjfaE'>{kC墬V7WI6/~"dZ{Ayi"eeITLDiKHK"90!F~]BwĂkF.ld9ڀLE| ̦0Ll͍`1֨RcmQ-'s3 (%(4hNGb"bxڎĝeB؜`:bM bM2twY} v=?6@twm74yל75Q#ǩQ>5~zflќvU%Q!Ja{)I!%bL884"wgz1=8dHo)q]{‘ޒ:JgD"qg{ u;(/ .YYmP\)kف0#9c͇',UƷK>ϗ/[vBRV,ަ֋yriUrX(_fG!o6p;2> ֱw?P]\^R6 X9V`t9Uka5[660Ӥa YB>$FU[!@靉p5NmC,DwJTH4ObBFy8O`H<Uq22y06 d`nSL(Q)"5.gpõCay>W)gd,ϯ\֬Qຘ5rgJ`nK`#oFoc6'U7eqfR~xo9C•u\\^jH|Gz)\ ǒq.F&±0>5訹cE ls M]|m'U0op4gv\c1}8a },nw=alEhč M({{k1+d"7WAT*t]AɈQD2 i0]7AWcFk~gA_p^4htЩک$Aǐv@g!o v.9Nx/h8T`qڧBvStܥ[n|\"b5ظCuz (.#)𣻣bz"?)Ў0/)Sotn'}M%/ RB0$5@"9%$}p@-01q&h' ѹՖm Iާf1*BW f=RaKstFbA;]x$O\]mgVmڐʹ\^86g3Ul;G73zBV1x"WusQN&@]4dD*@52PM ڮ4|ETcB,ސ;'!{m{J@F00uPӦ3PI4u4g#ߥ91'+ 
%Fcl|rI ͜"鲑c7o3 $-1wfO3ZF!Lvm 8BcJR פxV3!kRqԤm#&-t{8P%'CJ~=("c²JJY J"xQ:;E4 GČr QfDpxPhĖmc%@1Z䨄`(hQQ,2ű(+T9Id Z 6w/z(*gTcFFbUT%QSBXUdҴZ[" +12ISjs%vZuḞ /2| ı <-]xO3@la[y ާ@4j +0&|}81bҒp‹b{F W_Z Sx ^{:n"3gD 9x?i_fKFS~zf_!r5f̈/jG){z+2:>3٬ /Oo2 Mîk2"_ *d3ɲ}?lq}p24FSGe2au"/7r$/bLJ) AM΄AWwC(n+0 cNE"kߛ$/+7^0o"bxz6> j1 FBp9ƸyȻh#`_j1MVOڷJwupX"m\>Ƌl̈́I9柾Ľ2Qߓ"17k1sL\Z)쟣x h&@=;Hf= w7*Eu*h8\DJJe,cd%(` uSz7$Sza?3TӕydޜQiMuGpR/ݩzP/pG-ʻ:/|LGElDu )ҢAQ_'X$ߝJuܙ\;B;3]S4=mUO+GS1-ħr(_S$'6 g oGb=pG|KKhm19{6!m2$ul 9"87-AqtΏw{}:ԿՁ/] >$dᱧDԡu|KYWxJsJ=)m+cW7)mX`EWVה6$d6m `Sh@Ч_kJ:o.:?\ tͧt]RS :!8sNk쐀J% 2kZZh{N) KSpLKm4>gOz/>sX\ tJ%NXAm#YB/ALM/= } ͛F!qSMj$J )^D hD5뫪:5R7丞N QRKN\b ۳(=[4ޗ V u1\:P 5(!C0dh(egҁ¬JU40 #~b=`$,Ltf{W/IF L+_w3JbJh$Lz Cu=X 3:XGU3֋.I^FK48Y|iS"RK`b;'H@ O]5Dyr/BVy~˞"D%#AI$}L" ெŵJb A4o#$w1BGf*Egf^9.Kfy`0+f1[7‹z)X ?%`KxM&_Vb`&!`j G>ܯB屸YBqXky&i7|7IqXx:O'L~Rcv?mĤwQ5$g,^vA2r·7.{xșDre/]Dg(-~lFҺ(CZbR_ÿb FuXh#hud-CY/8u8J*Ɣ|`fsM$t D@&qj\{Xc ]*$DԖIR04 *(p6S}jE`7cg7?m*|bnhETDZHpG€kBav'L><&":Fc!XH A*!EQj0RG/'#E x>oͬK l -vyl0NƩ_=~MXr5(r3a!$aGر u! 8Ig ϳ̖߀;.0+Rx#a]32짅?g<dmyą ȋtM@ d\ص}Ұm$WػZWwo /hGQC V/zl5 MKC.ҢN>o^Pũ=gVFt>7Čc"kV޾cy(1Wf>~#I }w鶌'|No%I7ܓڳ#!/T`uRK\5/"F!KIk8esӘ+.#v==PMr2ak*LJrAP=H O&\&wRk7RTBދZNJ "f 6Mh&2a~$*W81Be6Jטc"mK'\ȅe3t#T>:Kpv,JQ IubR7i;cX F7x^cntl 8vA47AE\EP`W>r*5>(#;NP,'5s/9/[vϾ-Mv # gέH/|ЪS}Ev}jK޳Sl6Kn p7e{Mm8ƺ37o ޕ[}y9'VC]NsR9`]yӝ!J<8Wsw P}-o,I`U'sKJ4oUmo#$5>9$ GL&$AXJe;Ţ(E5_5Wge-zi>3{rS8=a:RI 9H ą#Q֣ l;%bO:}-]/+oxB:G`"<+5 'K^?껝L^2b'':mqR^9A{v"7cѕ C7iJ=ݱj*|-Vu%:*s)>ܜjGRqj֞m6M.x>Z$i*(jE{u]^qkU.בoi6(r@VĩwFZe;eyFiaks?*rJ"0d f;WjxekϮHn+?7,}d3PϿԋ:\؜h)Vk?~/7^%/kVE0+v`k]-1flԷ7 Af44,-/nKqS%pRÖp$1*6Λ>mAvlƖjF8&<šט +ڔ=[m_꧱lT(z]T8mN{0qq{0 M bD:ll^ 9?գWk=HӪto#)yxZ{i#Meu7&`kd͠G=镦^Jn܈e!\Х*rw乐C%+:xf_.a)5%[FRP **&eJI:AH추ҔuWge,Ua[Ŕr.b..M5/-&b-vqVZōpx&) @Yf^Țކߦ]܏mpp],ch]Hn^L*UoSmYr.r*gc,1H<ϲI?4LO;<_ 2~wC^㡀3&Y~Cn>? 
v\iD & >ƏADE;#>8Q2Q<7r2ږv$L!&LP Ig?-d1TlS T ɂ1e'm<@i0C/C}63]AٻuPpf夐!LR>H&oX.ެRײ&ܣLR hN"KdB!!4AjQ\(!A TlR%Cky)پ7I6Yq+'3NX$D&+ LӰfnh6;01|9=8o>XPwQd$R' 3-,^֡3;JW~)C܍rp/#sRefxNC0 ݠh.wUanh9)e𦺘a|B)G&L)=G)娜r4J*y< g{FBH#=0y`B$^D;j۷[j}z2b |PX0<g0<aR*/ H2Ch+.n=SZ^AF5gb"h_)u2:O'٤aB/RzR RRJR"y/t7T3TsRI)EC)a7SwgAUjc#?XLvE0 4U&z.ENW ר 8Rg[l xTQ2$i;}|Ҍ(݇sƎ^˝T?,{I=nOWCkYiydGzY=8u# e=DS8Nf]9ZlK7v͟~.'^}.'^᲍]s2iSQHWWc4thk )DOpӠ3КlP '%p{S[ :ò դ6)Tm®ncK4 *wS4SGJQŊ̡t %%%NYtʠO "h?<B2٬ʼjgWgrP)ng Qbu)N'M}?ʽՕ{+VWO[WpP0x ߼F$KRsPͅW}~㣚tQU̸lkErT)[ɵSӤ_Tj`MToʊVb$P*D6~ K!=yjKڪ3BNZʵ*kUZĶYJGcr5ẹ; #$1oPi-&$FZX+. x2gfሷ˷nnY;$)Px;Ɯg.üj۰Tyw- ~JRpUTꥒmO<]$O^RPx 2DV^b5% XTׂRQq]QDO_"$`^!XɖK죟bh_J^:*RU ]O fޓ6r#WY]̣4Q}${|`3J<#ɳ./ْVK٧Z%Af,uX*N$A8KPϪP{gtF+ӧx}!]6_KKrgH-"Q$Ŗ]ZŨSHP:g/.*fj6jb-H(4R#@q p=? gn]{jk&覔d B >^-ʟ?>W<j0%19k,b^W  0؜3#s`̛FAĞܧ`ƆFLI'uSL5(eI¨EQJ2DTRd4KS'saV$ЇEұQw}+PϪP0q>ɔGJ)2Eɵl3D#jVh\j'kj2guJ@o ,R Y"ϭHe=0D> >MWіUmDj46T-,v<*2v9;IN)p6\D~zR6N5nZF={} FqԫI󎲃nI2&蛰8]Y}[c1xTj~%EXIؿ}7k{&u\`U e?[_H}MZm"h7zx-?rypK} cbS& <*6i$ d+w_f^n^}hpSq엠UFFsO"AMĈR^bcvTѶN:x̮GPbֶ#iVV Xd&Bv}&/9y`0%8>ꈸtm8GHpьlZ7F J Tg$W|㉻+'#Ѿw~ U$~͕Kt0|&wã (^\DĮ̍YTIN \UǬq4)npi%՘qשu+9/Fom2-hϝp[i4G[n{sPV`Ns]=2ʻ.[ Zw7%.%8YD(ùL(Ifyv4`9SΥT((H)Ȉ'BGqrO7M۶dEowX V$Qgw7ߛGb8*>`{vNJRB4 J !?LQt M7s~LR\tu(A_. f%IjQ e*[)iRqC(aTꜧ.)ɤ=c+ɺU>Ͷ&m_t5 +I"7sۼaUsJ/-`ͳ/o,}񻵣@1y }2[%l͢,WaDis;( N~~)K7ΨV"Om4I.тBxǣq>DŒ yyʖFF^%UߓlcPi7d5+ss1~5;A"AbJgn4V]K=hMzWlJpt>k\xRM.?!Sp&?Gx GP.kKc1svI*c\81L)*$@ָ07@:%_[jaBA:itβyz=ݭ._я}x\ }׊flc@ɋe+s#L$'?5' eՆI?o?p{ >?1AT J\}\/&A"G݆?<b[DSLrLQj)Eru-GA&ěRchd8޶ K Y\:Cp6ƋwƔ唳]OEС@]N::ɡC:RJW%>z{k<}?k]#sothބH9ͼӜ3ޏf9ڟs;5BJi8EZu"Mچ@=5S$2~ fp]*}zJ}_Rt˧ސPQ5 =)vC=+w C |pݻ ^;##BA6wI绋拉gȮzwEH%mf43p Wj4ޞͼ~Sk_ q\-B 'wWm 4߱ #=EԟJ}N\n"0 i}Gc׆L70n+1Vh>,#9H4Mel'6- fEÇf[<O>ޢ^zAtLu%|1y}Ӄ~0:W ؔP#9ss jK|"Ƞ7u# rR2Dcۉz$ dtWGBO®STji֜o5S5Q. 
-fsoY|ٟ?D͈:e1ԒU"UzAXyѦ@ѿ~6*ܘŽ]Mnm1-)sKw7zRG5 qqUo&O4m<*,ݹeS_mUx 7GlG3^\_x]xjfG*r}w֑|"F7| KU[Gnmi#:mQGϙ <6vk{ݺ\D[TS~@sW): ֖1u1OT[ Hօ|"-S^Hd h|R15^zVYs8z:R*;뻟5EPzo :~rT=D77N½^Խ{Yzͫ/C+r  `NGL2 M%j0Z++dԫ c9j4㤚Z%d`/A=ہZ3zV,'X*Bei R2Գ*Ԝ;m):NJN26 ):NJ iqғRqR!𛗜cJKPϪPYJOZJ!C;)H_@-i񱰝QH)FZj𳔞+MWrv3\]bĿw|0(e K:2**yuiP򴳼݄ܕViş-* ןutMjbEPe ,qRYU#2 {R:-um˱ n{+aWR =եʞFHηVg7W\&C҃u:٬UN&~}~ eͽ|b.ȑ Zݭ#oD If[XFNF9x)dWI5:K2 <h%($ixck,UǼnoUՂ4̣f̛nxڇ!V`G764`:Iob$۵129ѻ@ ~ 1BAcڙ'˜N9P Fr]"Ȧctv1OARCd2'Lqr(6 SP EJ"nuIOS\[*G<< {ƗQ0LƣRx `vli}ۜuiNn37}JTOIñf`tms "9 S.c Y4E^t}kϥzArs d^a,I[Z H~|4B1]%"U_s0 }hjZn H1d3Pg i-&e2$;!D( rf{ {3ۙ/yB^#a|Y/C=ہZs^ig9 b!QH)qRZ@-RzR#);e/U#%Ľ삚ѫemClF ǀZK.(-c}um}ZFw'yxA0%HK :IsJ;*+4,b^;3 8Q;FࠃSeGΜˣزg`;sȫ)L sAܯ9+Ll@ }QI r7隽;_ ,ጊ-Ez,x4i,_έŤZ.u/nnM3e,hr3GZLCuC>/ ?/? :b㻭;l׻3TD`Uuzx\]G] k].GWűyb]#\߈zx_oxТTUT+**ӣ$UTZ"ʑ}D3՘zkQo/u8U2'T>y6B #C˃f}hx`rhEd?hHn*QAv;^0Qn)$;^2E$[[_ĈN1hcF4v!R\DThN,=~n6@ J[, S}cI;إjbq$x>ճjZ}jv]eHRO2NKݶeo7bߚg-eNnؒ!ѳz'ˮA򘠴kHXP yKd[HX*G%Kz5qoqd3k8Ϭ*gh@bް1*$y> P*mF="JLM1L]upƇj ˅|!6pt0 y ED$@;0Xq결|-Vn?ᅥltDfl%nkg㡝7gv"ɦTqFʷ#;ϊSNͭa/72RlnM;jnMܚv,x/̙̂!/˔z9?`&ݵLSaNLW evהʹ\Tl:d2G\BQ7PT:gCQQdž*2!*G4͖72:oB+>ni#ًс =6N 1Nq >H˃G[n/Rxo{q|;C1#+4 0tt0OCmi4H9T=891Rw8!%gu\B9`:E$ s$)tVg?,İMqӌє9:ASBw] "造1$8s@SZ] ӢJar'Q͊HTq(JW?%Q12ӈ9d)vs3s\H.4D1Τ "#R{ʩ%9Ք:a_ՏjO;'(THiN=wh8n:(lR -+(*ϥO%B56] Nɔ"`ip *op i4" a7IScL>c 68-!%V;ާ%@VVrRwKHydj j@@J>Pzo,qIB@2ŵdsǘ z,4 klNtjV;s>.D;{?g$SOA?%{ga.9QRXj\hbO׹͙ϑ1>} JrNa6?\^M>,eц`wI)0H[=' DV嘓Ս eރoIq<"$x3crnR)ØXq/M΋!cfB?aB'6:14[9Ug> a1P>7w6+Pޝ=,C)0bV;Ai߉IUœvVQ)QPHU@ɦ]/Ua I)US3W64Hӟ_ݍR/l, ˨LvV[l}Q/(~X;ZF2NgX%㧧@iΈUiƬ\w3$c6NQ1;AZk(7Pg>κԺuܮn`jY2C]["G}wEЦlKf(^qb+iśЈv5|jtÿۺaX$fl잖'4I}T{00,MkX{v}&kr $xOا̹,?|ˇ^Ri~GWI(Ek]hǜ[xz*,}NsoBarx)dn$lD e0ʹ8c&fh/XN< ,BЗ/1o\%82TRj!?m =~ <6.P~n週;ȕ}$-I&q9Pf~Dk2(ePfqJB,鯸>S]fe\|#\0ҝM1۞n)>8ȤWDx)|r"3S 9|F83s:7k̘vZ#9c #M8c  ='iI)QXSq͆DL\*y$|ILip3$9 H8ɋ6׹rB8D=-plS2iqkNAQتc㌝v?.F8ׇ/\r-ZJy#2`8N}`xQ2mư3C̰'>6"ް d\N8yȿꯀpT18*3ZHnػFr#W 9$AU|iw.v`fmee'ɳْVKm[]̬bUE)D+!%[5_~bhVy5Iݿn%bTM^g5/6di[[f#PMF@Mn7Veo}j+!wu6\^;JvoI6_]Tڄy#̈ 
Mar 14 05:26:55 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 14 05:26:55 crc restorecon[4703]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 05:26:55 crc restorecon[4703]:
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 14 05:26:55 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc 
restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 05:26:56 crc 
restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 
crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 
05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc 
restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc 
restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc 
restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc 
restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc 
restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 05:26:56 crc restorecon[4703]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 05:26:56 crc restorecon[4703]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 14 05:26:57 crc kubenswrapper[4713]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 05:26:57 crc kubenswrapper[4713]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 14 05:26:57 crc kubenswrapper[4713]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 05:26:57 crc kubenswrapper[4713]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 05:26:57 crc kubenswrapper[4713]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 14 05:26:57 crc kubenswrapper[4713]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.332978    4713 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342001    4713 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342033    4713 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342038    4713 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342043    4713 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342047    4713 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342052    4713 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342055    4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342059    4713 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342063    4713 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342068    4713 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342074    4713 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342081    4713 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342085    4713 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342089    4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342093    4713 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342097    4713 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342102    4713 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342105    4713 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342110    4713 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342114    4713 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342118    4713 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342122    4713 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342126    4713 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342133    4713 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342137    4713 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342141    4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342144    4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342148    4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342151    4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342155    4713 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342158    4713 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342163    4713 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342166    4713 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342170    4713 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342173    4713 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342177    4713 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342180    4713 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342184    4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342187    4713 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342191    4713 feature_gate.go:330] unrecognized feature gate: Example
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342194    4713 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342198    4713 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342223    4713 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342230    4713 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342234    4713 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342238    4713 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342243    4713 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342247    4713 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342251    4713 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342254    4713 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342258    4713 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342261    4713 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342265    4713 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342268    4713 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342272    4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342276    4713 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342279    4713 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342283    4713 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342286    4713 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342294    4713 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342299    4713 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342302    4713 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342306    4713 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342309    4713 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342312    4713 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342316    4713 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342319    4713 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342323    4713 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342326    4713 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342329    4713 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.342333    4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342438    4713 flags.go:64] FLAG: --address="0.0.0.0"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342451    4713 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342466    4713 flags.go:64] FLAG: --anonymous-auth="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342491    4713 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342501    4713 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342506    4713 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342514    4713 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342522    4713 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342527    4713 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342533    4713 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342539    4713 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342544    4713 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342549    4713 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342555    4713 flags.go:64] FLAG: --cgroup-root=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342560    4713 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342565    4713 flags.go:64] FLAG: --client-ca-file=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342569    4713 flags.go:64] FLAG: --cloud-config=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342574    4713 flags.go:64] FLAG: --cloud-provider=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342579    4713 flags.go:64] FLAG: --cluster-dns="[]"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342585    4713 flags.go:64] FLAG: --cluster-domain=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342590    4713 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342595    4713 flags.go:64] FLAG: --config-dir=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342599    4713 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342605    4713 flags.go:64] FLAG: --container-log-max-files="5"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342613    4713 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342618    4713 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342623    4713 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342628    4713 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342634    4713 flags.go:64] FLAG: --contention-profiling="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342638    4713 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342643    4713 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342649    4713 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342654    4713 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342665    4713 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342670    4713 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342675    4713 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342680    4713 flags.go:64] FLAG: --enable-load-reader="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342685    4713 flags.go:64] FLAG: --enable-server="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342692    4713 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342699    4713 flags.go:64] FLAG: --event-burst="100"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342705    4713 flags.go:64] FLAG: --event-qps="50"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342709    4713 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342714    4713 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342719    4713 flags.go:64] FLAG: --eviction-hard=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342727    4713 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342732    4713 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342737    4713 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342742    4713 flags.go:64] FLAG: --eviction-soft=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342748    4713 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342753    4713 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342758    4713 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342763    4713 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342768    4713 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342774    4713 flags.go:64] FLAG: --fail-swap-on="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342779    4713 flags.go:64] FLAG: --feature-gates=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342785    4713 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342790    4713 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342796    4713 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342801    4713 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342807    4713 flags.go:64] FLAG: --healthz-port="10248"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342817    4713 flags.go:64] FLAG: --help="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342822    4713 flags.go:64] FLAG: --hostname-override=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342827    4713 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342833    4713 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342838    4713 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342843    4713 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342848    4713 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342853    4713 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342859    4713 flags.go:64] FLAG: --image-service-endpoint=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342864    4713 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342871    4713 flags.go:64] FLAG: --kube-api-burst="100"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342876    4713 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342882    4713 flags.go:64] FLAG: --kube-api-qps="50"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342887    4713 flags.go:64] FLAG: --kube-reserved=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342893    4713 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342899    4713 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342905    4713 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342910    4713 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342915    4713 flags.go:64] FLAG: --lock-file=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342920    4713 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342926    4713 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342931    4713 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342940    4713 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342945    4713 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342950    4713 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342956    4713 flags.go:64] FLAG: --logging-format="text"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342961    4713 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342967    4713 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342972    4713 flags.go:64] FLAG: --manifest-url=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342977    4713 flags.go:64] FLAG: --manifest-url-header=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342984    4713 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342990    4713 flags.go:64] FLAG: --max-open-files="1000000"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.342997    4713 flags.go:64] FLAG: --max-pods="110"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343002    4713 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343007    4713 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343012    4713 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343019    4713 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343024    4713 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343029    4713 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343035    4713 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343050    4713 flags.go:64] FLAG: --node-status-max-images="50"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343056    4713 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343063    4713 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343069    4713 flags.go:64] FLAG: --pod-cidr=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343075    4713 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343084    4713 flags.go:64] FLAG: --pod-manifest-path=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343089    4713 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343094    4713 flags.go:64] FLAG: --pods-per-core="0"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343100    4713 flags.go:64] FLAG: --port="10250"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343105    4713 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343110    4713 flags.go:64] FLAG: --provider-id=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343116    4713 flags.go:64] FLAG: --qos-reserved=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343121    4713 flags.go:64] FLAG: --read-only-port="10255"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343127    4713 flags.go:64] FLAG: --register-node="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343132    4713 flags.go:64] FLAG: --register-schedulable="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343137    4713 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343147    4713 flags.go:64] FLAG: --registry-burst="10"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343152    4713 flags.go:64] FLAG: --registry-qps="5"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343157    4713 flags.go:64] FLAG: --reserved-cpus=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343163    4713 flags.go:64] FLAG: --reserved-memory=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343170    4713 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343175    4713 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343181    4713 flags.go:64] FLAG: --rotate-certificates="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343186    4713 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343191    4713 flags.go:64] FLAG: --runonce="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343196    4713 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343220    4713 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343227    4713 flags.go:64] FLAG: --seccomp-default="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343232    4713 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343238    4713 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343243    4713 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343248    4713 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343256    4713 flags.go:64] FLAG: --storage-driver-password="root"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343266    4713 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343271    4713 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343277    4713 flags.go:64] FLAG: --storage-driver-user="root"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343282    4713 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343288    4713 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343294    4713 flags.go:64] FLAG: --system-cgroups=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343299    4713 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343308    4713 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343313    4713 flags.go:64] FLAG: --tls-cert-file=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343318    4713 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343325    4713 flags.go:64] FLAG: --tls-min-version=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343330    4713 flags.go:64] FLAG: --tls-private-key-file=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343335    4713 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343340    4713 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343345    4713 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343351    4713 flags.go:64] FLAG: --v="2"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343359    4713 flags.go:64] FLAG: --version="false"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343366    4713 flags.go:64] FLAG: --vmodule=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343373    4713 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343380    4713 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343509    4713 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343517    4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343522    4713 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343527    4713 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343531    4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343537    4713 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343542    4713 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343546    4713 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343551    4713 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343556    4713 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343560    4713 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343565    4713 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343574    4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343578    4713 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343583    4713 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343588    4713 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343593    4713 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343598    4713 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343603    4713 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343607    4713 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343612    4713 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343617    4713 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343621    4713 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343626    4713 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343630    4713 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343634    4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343639    4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343645    4713 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343651    4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343656    4713 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343661    4713 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343665    4713 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343669    4713 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343674    4713 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343678    4713 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343683    4713 feature_gate.go:330] unrecognized feature gate: Example
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343687    4713 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343692    4713 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343696    4713 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343700    4713 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343704    4713 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343710    4713 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343716    4713 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343723    4713 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343731    4713 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343736    4713 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343742    4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343747    4713 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343751    4713 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343756    4713 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343760    4713 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343766    4713 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343770    4713 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343774    4713 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343779    4713 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343785    4713 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343790    4713 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343795    4713 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343799    4713 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343804    4713 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343808    4713 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343812    4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343817    4713 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343822    4713 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343826    4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343830    4713 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343835    4713 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343840    4713 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343844    4713 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343850    4713 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.343855 4713 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.343872 4713 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.357145 4713 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.357196 4713 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357304 4713 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357314 4713 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357319 4713 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357454 4713 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357461 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357465 4713 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357470 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357475 4713 
feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357480 4713 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357484 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357487 4713 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357492 4713 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357496 4713 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357501 4713 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357507 4713 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357512 4713 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357516 4713 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357520 4713 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357524 4713 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357528 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357532 4713 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357536 4713 feature_gate.go:330] unrecognized 
feature gate: RouteAdvertisements Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357541 4713 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357546 4713 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357551 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357556 4713 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357561 4713 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357567 4713 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357572 4713 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357576 4713 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357580 4713 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357586 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357612 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357620 4713 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357627 4713 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357632 4713 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357638 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357643 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357647 4713 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357651 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357656 4713 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357660 4713 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357665 4713 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357670 4713 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357675 4713 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357681 4713 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357686 4713 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357692 4713 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357698 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357703 4713 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357709 4713 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357715 4713 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357722 4713 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357727 4713 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357733 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357738 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357743 4713 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357748 4713 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357753 4713 feature_gate.go:330] unrecognized feature gate: Example Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357763 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357768 4713 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357774 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357779 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357784 4713 
feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357788 4713 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357793 4713 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357798 4713 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357802 4713 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357807 4713 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357812 4713 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.357817 4713 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.357826 4713 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358003 4713 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358015 4713 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358020 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 
05:26:57.358025 4713 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358030 4713 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358036 4713 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358040 4713 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358045 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358050 4713 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358054 4713 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358058 4713 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358063 4713 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358067 4713 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358072 4713 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358076 4713 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358080 4713 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358085 4713 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358091 4713 feature_gate.go:353] Setting GA feature gate 
CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358099 4713 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358105 4713 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358111 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358116 4713 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358122 4713 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358127 4713 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358132 4713 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358137 4713 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358141 4713 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358147 4713 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358152 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358157 4713 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358161 4713 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358166 4713 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358171 4713 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358175 4713 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358180 4713 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358184 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358189 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358194 4713 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358199 4713 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358224 4713 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358230 4713 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358235 4713 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358239 4713 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358244 4713 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358248 4713 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358253 4713 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358257 4713 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358262 4713 feature_gate.go:330] unrecognized feature gate: Example Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358266 4713 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358271 4713 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358275 4713 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358280 4713 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358286 4713 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358290 4713 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358297 4713 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358303 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358308 4713 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358313 4713 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358356 4713 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358362 4713 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358367 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358373 4713 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358377 4713 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358382 4713 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358387 4713 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358391 4713 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358396 4713 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358401 4713 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358405 4713 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 
05:26:57.358410 4713 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.358416 4713 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.358423 4713 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.358648 4713 server.go:940] "Client rotation is on, will bootstrap in background" Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.364732 4713 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.372396 4713 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.372545 4713 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.375571 4713 server.go:997] "Starting client certificate rotation" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.375621 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.375783 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.401881 4713 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.404127 4713 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.405397 4713 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.419546 4713 log.go:25] "Validated CRI v1 runtime API" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.455177 4713 log.go:25] "Validated CRI v1 image API" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.456703 4713 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.462335 4713 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-14-05-21-51-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.462386 4713 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.484319 4713 manager.go:217] Machine: {Timestamp:2026-03-14 05:26:57.480033521 +0000 UTC m=+0.567942861 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ebb26f6c-82bb-440f-9e91-918044368ed3 BootID:4bde9573-5f81-4b74-9efd-4a4fc782d1ac Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:94:8d:b1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:94:8d:b1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a9:75:15 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:dc:bd:1c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:dd:a2:37 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:36:a0:86 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ea:7e:7d:3e:74:f0 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:33:64:61:08:77 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.484632 4713 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.484898 4713 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.486440 4713 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.486675 4713 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.486729 4713 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.487004 4713 topology_manager.go:138] "Creating topology manager with none policy" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.487019 4713 container_manager_linux.go:303] "Creating device plugin manager" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.487642 4713 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.487680 4713 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.488474 4713 state_mem.go:36] "Initialized new in-memory state store" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.488869 4713 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.492385 4713 kubelet.go:418] "Attempting to sync node with API server" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.492423 4713 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.492457 4713 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.492476 4713 kubelet.go:324] "Adding apiserver pod source" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.492490 4713 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 
05:26:57.498320 4713 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.499165 4713 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.499680 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.499774 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.500001 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.500259 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.500589 4713 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 14 05:26:57 
crc kubenswrapper[4713]: I0314 05:26:57.501987 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502016 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502026 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502036 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502053 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502061 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502069 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502082 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502092 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502102 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502118 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.502127 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.503317 4713 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.503870 4713 server.go:1280] "Started kubelet" Mar 14 05:26:57 crc systemd[1]: Started Kubernetes 
Kubelet. Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.506036 4713 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.506398 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.506180 4713 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.507454 4713 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.509567 4713 server.go:460] "Adding debug handlers to kubelet server" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.509690 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.509740 4713 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.510128 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.510226 4713 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.510234 4713 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.510378 4713 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.509324 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c9df25a8aeaaa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,LastTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.511556 4713 factory.go:55] Registering systemd factory Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.511610 4713 factory.go:221] Registration of the systemd container factory successfully Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.512614 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.512660 4713 factory.go:153] Registering CRI-O factory Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.512687 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.512695 4713 factory.go:221] Registration of the crio container factory successfully Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 
05:26:57.512784 4713 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.512807 4713 factory.go:103] Registering Raw factory Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.512830 4713 manager.go:1196] Started watching for new ooms in manager Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.512802 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="200ms" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.521098 4713 manager.go:319] Starting recovery of all containers Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527410 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527489 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527505 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527519 4713 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527531 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527543 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527557 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527570 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527587 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527600 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527613 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527625 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527637 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527651 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527663 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527695 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527706 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527718 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527729 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527741 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527755 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527767 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527779 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527792 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527807 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527820 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527851 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527865 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527878 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527890 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527903 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527941 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527953 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527966 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.527978 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528015 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528028 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528045 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528060 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528073 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528089 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528121 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528134 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528153 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528166 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528178 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528190 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528243 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528257 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528270 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528289 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528301 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528331 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528345 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528365 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528384 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528405 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528426 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" 
seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528439 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528461 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528473 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528490 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528509 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528521 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528533 
4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528546 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528559 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528579 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528592 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528603 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528615 4713 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528628 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528641 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528653 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528674 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528686 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528699 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528712 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528726 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528744 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528762 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528773 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528785 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.528803 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531044 4713 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531124 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531150 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531163 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531182 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531195 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531223 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531237 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531248 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531297 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531331 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531344 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531360 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531377 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531412 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531427 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531440 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531455 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531471 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531492 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531591 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531679 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531699 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531719 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531736 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531755 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531773 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531795 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531847 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531864 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531889 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531911 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531928 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531976 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.531991 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532004 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532048 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532063 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532077 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532089 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532133 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532148 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532288 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532307 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532389 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532413 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532450 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" 
seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532463 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532479 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532493 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532508 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532520 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532554 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532566 4713 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532580 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532593 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532607 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532620 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532634 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532648 4713 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532686 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532734 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532764 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532777 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532789 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532801 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532853 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532865 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532900 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532913 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532926 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532940 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532953 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.532968 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533025 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533042 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533077 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533092 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533124 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533140 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533154 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533167 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533182 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533198 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533278 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533315 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533330 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533348 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533364 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533435 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533452 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533465 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533502 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533520 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533533 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533547 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533562 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533578 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533687 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533704 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533758 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533771 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533783 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533838 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533922 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533956 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.533989 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534002 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534033 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534049 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534088 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534102 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534117 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534137 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534151 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534168 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534222 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534298 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534314 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534327 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534340 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534393 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534410 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534429 4713 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534475 4713 reconstruct.go:97] "Volume reconstruction finished"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.534487 4713 reconciler.go:26] "Reconciler: start to sync state"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.539136 4713 manager.go:324] Recovery completed
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.549621 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.552896 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.552928 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.552937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.553844 4713 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.553915 4713 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.553977 4713 state_mem.go:36] "Initialized new in-memory state store"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.560416 4713 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.562333 4713 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.562389 4713 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.562418 4713 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.562478 4713 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 14 05:26:57 crc kubenswrapper[4713]: W0314 05:26:57.563732 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused
Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.563771 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.574681 4713 policy_none.go:49] "None policy: Start"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.575757 4713 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.575805 4713 state_mem.go:35] "Initializing new in-memory state store"
Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.611123 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.634453 4713 manager.go:334] "Starting Device Plugin manager"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.634533 4713 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.634550 4713 server.go:79] "Starting device plugin registration server"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.635058 4713 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.635081 4713 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.635426 4713 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.635562 4713 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.635575 4713 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.641700 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.662859 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.663009 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.664254 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.664279 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.664312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.664487 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.664658 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.664708 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.665900 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.665928 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.665938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.666109 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.666345 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.666406 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.666666 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.666694 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.666706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.667265 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.667285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.667316 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.667329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.667336 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.667360 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.667449 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.667478 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.667499 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.668223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.668256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.668268 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.668447 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.668475 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.668486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.668648 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.668872 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.668909 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.669390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.669417 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.669426 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.669592 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.669614 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.670254 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.670272 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.670281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.670372 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.670406 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.670419 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.713552 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="400ms"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.735186 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.735825 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.735863 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.735895 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.735916 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.735938 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.735975 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736012 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736032 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736082 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736127 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736149 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736194 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736249 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736293 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736372 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736526 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736565 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736574 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.736598 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.736999 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837405 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837455 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837476 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837495 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837518 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837541 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837564 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837588 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837637 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837661 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837738 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837743 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837782 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837661 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837792 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837909 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837918 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837936 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837931 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837974 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837997 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.837981 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.838008 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.838039 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.838231 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.838252 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.838266 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.838350 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.838396 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.838385 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.937532 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.939066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.939241 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.939265 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.939354 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:26:57 crc kubenswrapper[4713]: E0314 05:26:57.940586 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.991540 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 05:26:57 crc kubenswrapper[4713]: I0314 05:26:57.997525 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.026945 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.051936 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 05:26:58 crc kubenswrapper[4713]: W0314 05:26:58.053339 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cab0cda8668cce2d017ab92178396c46235e8cfb4efa85e30754bb43229fb2e5 WatchSource:0}: Error finding container cab0cda8668cce2d017ab92178396c46235e8cfb4efa85e30754bb43229fb2e5: Status 404 returned error can't find the container with id cab0cda8668cce2d017ab92178396c46235e8cfb4efa85e30754bb43229fb2e5 Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.061336 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:26:58 crc kubenswrapper[4713]: W0314 05:26:58.068182 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-91f9c4521bcbef23072fba085b202911d464d4b2dc7df0216cec9895ac0d0760 WatchSource:0}: Error finding container 91f9c4521bcbef23072fba085b202911d464d4b2dc7df0216cec9895ac0d0760: Status 404 returned error can't find the container with id 91f9c4521bcbef23072fba085b202911d464d4b2dc7df0216cec9895ac0d0760 Mar 14 05:26:58 crc kubenswrapper[4713]: W0314 05:26:58.072911 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-583e1e45fac6eb1007f1f64464a585e67be19ce3f251b9c72acc5cf5890dc139 WatchSource:0}: Error finding container 583e1e45fac6eb1007f1f64464a585e67be19ce3f251b9c72acc5cf5890dc139: Status 404 returned error can't find 
the container with id 583e1e45fac6eb1007f1f64464a585e67be19ce3f251b9c72acc5cf5890dc139 Mar 14 05:26:58 crc kubenswrapper[4713]: W0314 05:26:58.080251 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-34e80f74f06ee94d8fa5f07672b66d6b7a2084e948d563584b21bca148bd102d WatchSource:0}: Error finding container 34e80f74f06ee94d8fa5f07672b66d6b7a2084e948d563584b21bca148bd102d: Status 404 returned error can't find the container with id 34e80f74f06ee94d8fa5f07672b66d6b7a2084e948d563584b21bca148bd102d Mar 14 05:26:58 crc kubenswrapper[4713]: E0314 05:26:58.115372 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="800ms" Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.341051 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.342679 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.342714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.342726 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.342750 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:26:58 crc kubenswrapper[4713]: E0314 05:26:58.343157 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: 
connect: connection refused" node="crc" Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.507300 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.566843 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"995605e91a1b3de8998a0f6cbb7cba185d9874698de8178731268556a6fb262b"} Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.568566 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cab0cda8668cce2d017ab92178396c46235e8cfb4efa85e30754bb43229fb2e5"} Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.569598 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"34e80f74f06ee94d8fa5f07672b66d6b7a2084e948d563584b21bca148bd102d"} Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.570608 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"583e1e45fac6eb1007f1f64464a585e67be19ce3f251b9c72acc5cf5890dc139"} Mar 14 05:26:58 crc kubenswrapper[4713]: I0314 05:26:58.571826 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"91f9c4521bcbef23072fba085b202911d464d4b2dc7df0216cec9895ac0d0760"} Mar 14 05:26:58 crc kubenswrapper[4713]: W0314 
05:26:58.633275 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:58 crc kubenswrapper[4713]: E0314 05:26:58.633339 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:26:58 crc kubenswrapper[4713]: W0314 05:26:58.683393 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:58 crc kubenswrapper[4713]: E0314 05:26:58.683473 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:26:58 crc kubenswrapper[4713]: W0314 05:26:58.817077 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:58 crc kubenswrapper[4713]: E0314 05:26:58.817180 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:26:58 crc kubenswrapper[4713]: E0314 05:26:58.916261 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="1.6s" Mar 14 05:26:58 crc kubenswrapper[4713]: W0314 05:26:58.942958 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:58 crc kubenswrapper[4713]: E0314 05:26:58.943062 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.143717 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.145087 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.145127 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.145136 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:26:59 crc kubenswrapper[4713]: 
I0314 05:26:59.145180 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:26:59 crc kubenswrapper[4713]: E0314 05:26:59.145735 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.507476 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.527707 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 05:26:59 crc kubenswrapper[4713]: E0314 05:26:59.528623 4713 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.576425 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20" exitCode=0 Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.576498 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.576558 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20"} Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.577494 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.577569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.577621 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.578667 4713 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b11f1baf195192fbec7d445e546c6c16e48917bcca3ed67d10f699d3f268ce2b" exitCode=0 Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.578769 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b11f1baf195192fbec7d445e546c6c16e48917bcca3ed67d10f699d3f268ce2b"} Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.579691 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.580612 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.580767 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.580808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.580827 4713 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.582169 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.582196 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.582229 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.582335 4713 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4" exitCode=0 Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.582413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4"} Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.582993 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.585743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.585798 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.585819 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.589933 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fb5a6e7fb2309ccbbdf3ee09496cec1939b803616e46bd5080e12d654f97c299"} Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.590179 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3251da253d7431414854ca2550b3bed65f90d2a661cd6c42fb6c0770bcb84421"} Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.590435 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"65362674790f184b43d25af7ea21bb1608f5df3a1172a55542521e877b689f12"} Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.590617 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b2b180c1d9cb12d2671ac8458febd65d67666e67306dfc203cc8890e2cb5b15b"} Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.589984 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.591387 4713 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf" exitCode=0 Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.591480 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf"} Mar 14 05:26:59 crc kubenswrapper[4713]: 
I0314 05:26:59.591506 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.593570 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.593607 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.593619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.594654 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.594685 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.594695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.673163 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:26:59 crc kubenswrapper[4713]: I0314 05:26:59.949063 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.267009 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.332950 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:27:00 crc 
kubenswrapper[4713]: I0314 05:27:00.507597 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:27:00 crc kubenswrapper[4713]: E0314 05:27:00.517674 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="3.2s" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.595978 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81"} Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.596155 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.596522 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2"} Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.596547 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739"} Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.597015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.597032 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.597044 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.601512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580"} Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.601965 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0"} Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.602058 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2"} Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.602141 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06"} Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.604501 4713 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3b148a74acc00c7190f62bc47111dade546767963ad6c61880477dcfc9b591e1" exitCode=0 Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.604660 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:00 crc kubenswrapper[4713]: 
I0314 05:27:00.605101 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3b148a74acc00c7190f62bc47111dade546767963ad6c61880477dcfc9b591e1"} Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.607310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.607338 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.607349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.610892 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.611317 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.611559 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cb2f424c581ebd1d65c32edf68989401404343e9dffb9b83c9d6fed80f65382b"} Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.612181 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.612218 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.612226 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 
05:27:00.612626 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.612638 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.612644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.746684 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.748699 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.748745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.748754 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:00 crc kubenswrapper[4713]: I0314 05:27:00.748777 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:27:00 crc kubenswrapper[4713]: E0314 05:27:00.749143 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Mar 14 05:27:01 crc kubenswrapper[4713]: W0314 05:27:01.011849 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 14 05:27:01 crc kubenswrapper[4713]: E0314 05:27:01.011954 4713 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.617819 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6df6314df72163aacd594733c524dc8ba778789b5ba14eba6a99cc795a4a3b54"} Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.617904 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.619129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.619160 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.619170 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.621021 4713 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c24f5d4239218a588477d136d37232bdb04685dd0f1b46cddb6fcce7d84465e7" exitCode=0 Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.621133 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.621145 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c24f5d4239218a588477d136d37232bdb04685dd0f1b46cddb6fcce7d84465e7"} Mar 14 05:27:01 
crc kubenswrapper[4713]: I0314 05:27:01.621166 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.621135 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.621242 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.621337 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622248 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622283 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622375 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622403 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622392 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622766 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622797 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:01 crc kubenswrapper[4713]: I0314 05:27:01.622810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:02 crc kubenswrapper[4713]: I0314 05:27:02.627917 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 05:27:02 crc kubenswrapper[4713]: I0314 05:27:02.627963 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:02 crc kubenswrapper[4713]: I0314 05:27:02.628369 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43edeb2fb633e59075be5441af86a4900252c13999743027187349c45a9818c1"} Mar 14 05:27:02 crc kubenswrapper[4713]: I0314 05:27:02.628415 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a8cf31a51e4aed330e2243890414f6e77e0b2376c7d10d29bb181c11a004ebe6"} Mar 14 05:27:02 crc kubenswrapper[4713]: I0314 05:27:02.628430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aff7c162dc04b24c00fe296bd9f54f629666770337c5b5a1758ad7c1ffc2b313"} Mar 14 05:27:02 crc kubenswrapper[4713]: I0314 05:27:02.628445 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"15eeeed96a6933a36b7b173b63deb9670a36b6be7ab19abec94835db3e568aa2"} Mar 14 05:27:02 crc kubenswrapper[4713]: I0314 05:27:02.628816 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:02 crc kubenswrapper[4713]: I0314 05:27:02.628878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:02 crc kubenswrapper[4713]: I0314 05:27:02.628898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.639499 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c28147b0d7173eb1ae6af953e5ceb55a4d79ef66d2fc79470b427a3054b8b30e"} Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.639772 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.641508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.641547 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.641558 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.683090 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.950149 4713 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.951510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.951552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.951564 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.951585 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:27:03 crc kubenswrapper[4713]: I0314 05:27:03.973244 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.262850 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.263030 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.264597 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.264639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.264648 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.641264 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.642262 4713 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.642286 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.642296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.683271 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.683458 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.684611 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.684644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.684653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.792727 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.792934 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.794187 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:04 crc kubenswrapper[4713]: I0314 05:27:04.794303 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:04 crc 
kubenswrapper[4713]: I0314 05:27:04.794332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:05 crc kubenswrapper[4713]: I0314 05:27:05.317831 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:27:05 crc kubenswrapper[4713]: I0314 05:27:05.643927 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:05 crc kubenswrapper[4713]: I0314 05:27:05.643975 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:05 crc kubenswrapper[4713]: I0314 05:27:05.645117 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:05 crc kubenswrapper[4713]: I0314 05:27:05.645163 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:05 crc kubenswrapper[4713]: I0314 05:27:05.645178 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:05 crc kubenswrapper[4713]: I0314 05:27:05.645191 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:05 crc kubenswrapper[4713]: I0314 05:27:05.645332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:05 crc kubenswrapper[4713]: I0314 05:27:05.645356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:07 crc kubenswrapper[4713]: I0314 05:27:07.135432 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 05:27:07 crc kubenswrapper[4713]: I0314 05:27:07.135874 4713 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 14 05:27:07 crc kubenswrapper[4713]: I0314 05:27:07.138148 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:07 crc kubenswrapper[4713]: I0314 05:27:07.138259 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:07 crc kubenswrapper[4713]: I0314 05:27:07.138289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:07 crc kubenswrapper[4713]: E0314 05:27:07.641806 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 05:27:07 crc kubenswrapper[4713]: I0314 05:27:07.792858 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 05:27:07 crc kubenswrapper[4713]: I0314 05:27:07.792985 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 05:27:09 crc kubenswrapper[4713]: I0314 05:27:09.802726 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 14 05:27:09 crc kubenswrapper[4713]: I0314 05:27:09.803566 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:09 crc kubenswrapper[4713]: I0314 05:27:09.804656 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:09 crc kubenswrapper[4713]: I0314 05:27:09.804695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:09 crc kubenswrapper[4713]: I0314 05:27:09.804706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:10 crc kubenswrapper[4713]: I0314 05:27:10.340662 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:27:10 crc kubenswrapper[4713]: I0314 05:27:10.340909 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:10 crc kubenswrapper[4713]: I0314 05:27:10.342137 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:10 crc kubenswrapper[4713]: I0314 05:27:10.342174 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:10 crc kubenswrapper[4713]: I0314 05:27:10.342183 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:11 crc kubenswrapper[4713]: W0314 05:27:11.334896 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.335005 4713 trace.go:236] Trace[2145826983]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (14-Mar-2026 05:27:01.333) (total time: 10001ms): Mar 14 05:27:11 crc kubenswrapper[4713]: Trace[2145826983]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:27:11.334) Mar 14 05:27:11 crc kubenswrapper[4713]: Trace[2145826983]: [10.001718996s] [10.001718996s] END Mar 14 05:27:11 crc kubenswrapper[4713]: E0314 05:27:11.335031 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 14 05:27:11 crc kubenswrapper[4713]: W0314 05:27:11.344531 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.344644 4713 trace.go:236] Trace[1679901384]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (14-Mar-2026 05:27:01.343) (total time: 10001ms): Mar 14 05:27:11 crc kubenswrapper[4713]: Trace[1679901384]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:27:11.344) Mar 14 05:27:11 crc kubenswrapper[4713]: Trace[1679901384]: [10.00156298s] [10.00156298s] END Mar 14 05:27:11 crc kubenswrapper[4713]: E0314 05:27:11.344665 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 14 05:27:11 crc kubenswrapper[4713]: W0314 05:27:11.445449 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z Mar 14 05:27:11 crc kubenswrapper[4713]: E0314 05:27:11.445530 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:11 crc kubenswrapper[4713]: E0314 05:27:11.447848 4713 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:11 crc kubenswrapper[4713]: E0314 05:27:11.449125 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.450295 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z Mar 14 05:27:11 crc kubenswrapper[4713]: E0314 05:27:11.450767 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c9df25a8aeaaa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,LastTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:27:11 crc kubenswrapper[4713]: E0314 05:27:11.452625 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.455226 4713 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.455286 4713 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 14 05:27:11 crc kubenswrapper[4713]: W0314 05:27:11.457460 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z Mar 14 05:27:11 crc kubenswrapper[4713]: E0314 05:27:11.457534 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.464751 4713 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.464820 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.513348 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:11Z is after 2026-02-23T05:33:13Z Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.662893 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.664989 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6df6314df72163aacd594733c524dc8ba778789b5ba14eba6a99cc795a4a3b54" exitCode=255 Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.665033 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6df6314df72163aacd594733c524dc8ba778789b5ba14eba6a99cc795a4a3b54"} Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.665166 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.666116 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.666155 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.666168 4713 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 14 05:27:11 crc kubenswrapper[4713]: I0314 05:27:11.670828 4713 scope.go:117] "RemoveContainer" containerID="6df6314df72163aacd594733c524dc8ba778789b5ba14eba6a99cc795a4a3b54" Mar 14 05:27:12 crc kubenswrapper[4713]: I0314 05:27:12.510379 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:12Z is after 2026-02-23T05:33:13Z Mar 14 05:27:12 crc kubenswrapper[4713]: I0314 05:27:12.669690 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 05:27:12 crc kubenswrapper[4713]: I0314 05:27:12.671074 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e4029fa53d03c5549f5805f7a2ebd4c652adb0bbf2c2ae4102f4714dc5a139a"} Mar 14 05:27:12 crc kubenswrapper[4713]: I0314 05:27:12.671186 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:12 crc kubenswrapper[4713]: I0314 05:27:12.671825 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:12 crc kubenswrapper[4713]: I0314 05:27:12.671844 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:12 crc kubenswrapper[4713]: I0314 05:27:12.671853 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.511819 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:13Z is after 2026-02-23T05:33:13Z Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.676135 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.676769 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.678348 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e4029fa53d03c5549f5805f7a2ebd4c652adb0bbf2c2ae4102f4714dc5a139a" exitCode=255 Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.678393 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3e4029fa53d03c5549f5805f7a2ebd4c652adb0bbf2c2ae4102f4714dc5a139a"} Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.678443 4713 scope.go:117] "RemoveContainer" containerID="6df6314df72163aacd594733c524dc8ba778789b5ba14eba6a99cc795a4a3b54" Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.678685 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.679799 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.679826 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.679836 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:13 crc kubenswrapper[4713]: I0314 05:27:13.680395 4713 scope.go:117] "RemoveContainer" containerID="3e4029fa53d03c5549f5805f7a2ebd4c652adb0bbf2c2ae4102f4714dc5a139a" Mar 14 05:27:13 crc kubenswrapper[4713]: E0314 05:27:13.681597 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 05:27:14 crc kubenswrapper[4713]: I0314 05:27:14.262914 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:27:14 crc kubenswrapper[4713]: I0314 05:27:14.510627 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:14Z is after 2026-02-23T05:33:13Z Mar 14 05:27:14 crc kubenswrapper[4713]: I0314 05:27:14.681978 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 05:27:14 crc kubenswrapper[4713]: I0314 05:27:14.683854 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:14 crc kubenswrapper[4713]: I0314 05:27:14.684661 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:14 crc 
kubenswrapper[4713]: I0314 05:27:14.684693 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:14 crc kubenswrapper[4713]: I0314 05:27:14.684703 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:14 crc kubenswrapper[4713]: I0314 05:27:14.685243 4713 scope.go:117] "RemoveContainer" containerID="3e4029fa53d03c5549f5805f7a2ebd4c652adb0bbf2c2ae4102f4714dc5a139a" Mar 14 05:27:14 crc kubenswrapper[4713]: E0314 05:27:14.686063 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 05:27:14 crc kubenswrapper[4713]: I0314 05:27:14.688931 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:27:14 crc kubenswrapper[4713]: W0314 05:27:14.793457 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:14Z is after 2026-02-23T05:33:13Z Mar 14 05:27:14 crc kubenswrapper[4713]: E0314 05:27:14.794368 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T05:27:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:15 crc kubenswrapper[4713]: W0314 05:27:15.004906 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:15Z is after 2026-02-23T05:33:13Z Mar 14 05:27:15 crc kubenswrapper[4713]: E0314 05:27:15.005039 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:15 crc kubenswrapper[4713]: I0314 05:27:15.184622 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:27:15 crc kubenswrapper[4713]: I0314 05:27:15.322323 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:27:15 crc kubenswrapper[4713]: W0314 05:27:15.347266 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:15Z is after 2026-02-23T05:33:13Z Mar 14 05:27:15 crc kubenswrapper[4713]: E0314 05:27:15.347397 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed 
to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:15 crc kubenswrapper[4713]: I0314 05:27:15.510489 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:15Z is after 2026-02-23T05:33:13Z Mar 14 05:27:15 crc kubenswrapper[4713]: I0314 05:27:15.687417 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:15 crc kubenswrapper[4713]: I0314 05:27:15.688809 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:15 crc kubenswrapper[4713]: I0314 05:27:15.688874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:15 crc kubenswrapper[4713]: I0314 05:27:15.688889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:15 crc kubenswrapper[4713]: I0314 05:27:15.689866 4713 scope.go:117] "RemoveContainer" containerID="3e4029fa53d03c5549f5805f7a2ebd4c652adb0bbf2c2ae4102f4714dc5a139a" Mar 14 05:27:15 crc kubenswrapper[4713]: E0314 05:27:15.690317 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 05:27:16 crc kubenswrapper[4713]: I0314 05:27:16.509828 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:16Z is after 2026-02-23T05:33:13Z Mar 14 05:27:16 crc kubenswrapper[4713]: I0314 05:27:16.689466 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:16 crc kubenswrapper[4713]: I0314 05:27:16.690349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:16 crc kubenswrapper[4713]: I0314 05:27:16.690401 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:16 crc kubenswrapper[4713]: I0314 05:27:16.690415 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:16 crc kubenswrapper[4713]: I0314 05:27:16.691087 4713 scope.go:117] "RemoveContainer" containerID="3e4029fa53d03c5549f5805f7a2ebd4c652adb0bbf2c2ae4102f4714dc5a139a" Mar 14 05:27:16 crc kubenswrapper[4713]: E0314 05:27:16.691284 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 05:27:17 crc kubenswrapper[4713]: I0314 05:27:17.510161 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:17Z is after 2026-02-23T05:33:13Z Mar 14 05:27:17 crc kubenswrapper[4713]: E0314 05:27:17.641931 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 05:27:17 crc kubenswrapper[4713]: I0314 05:27:17.793978 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 05:27:17 crc kubenswrapper[4713]: I0314 05:27:17.794090 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 05:27:17 crc kubenswrapper[4713]: I0314 05:27:17.853577 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:17 crc kubenswrapper[4713]: I0314 05:27:17.854611 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:17 crc kubenswrapper[4713]: I0314 05:27:17.854640 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:17 crc kubenswrapper[4713]: I0314 05:27:17.854649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 
05:27:17 crc kubenswrapper[4713]: I0314 05:27:17.854670 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:27:17 crc kubenswrapper[4713]: E0314 05:27:17.854704 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:17Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 14 05:27:17 crc kubenswrapper[4713]: E0314 05:27:17.858754 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:17Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 05:27:18 crc kubenswrapper[4713]: I0314 05:27:18.510888 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:18Z is after 2026-02-23T05:33:13Z Mar 14 05:27:19 crc kubenswrapper[4713]: I0314 05:27:19.511726 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:19Z is after 2026-02-23T05:33:13Z Mar 14 05:27:19 crc kubenswrapper[4713]: I0314 05:27:19.591413 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 05:27:19 crc kubenswrapper[4713]: E0314 05:27:19.599004 4713 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:19 crc kubenswrapper[4713]: W0314 05:27:19.740255 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:19Z is after 2026-02-23T05:33:13Z Mar 14 05:27:19 crc kubenswrapper[4713]: E0314 05:27:19.740371 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:19 crc kubenswrapper[4713]: I0314 05:27:19.832121 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 14 05:27:19 crc kubenswrapper[4713]: I0314 05:27:19.832421 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:19 crc kubenswrapper[4713]: I0314 05:27:19.833837 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:19 crc kubenswrapper[4713]: I0314 05:27:19.833870 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 05:27:19 crc kubenswrapper[4713]: I0314 05:27:19.833881 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:19 crc kubenswrapper[4713]: I0314 05:27:19.843313 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 14 05:27:20 crc kubenswrapper[4713]: I0314 05:27:20.509758 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:20Z is after 2026-02-23T05:33:13Z Mar 14 05:27:20 crc kubenswrapper[4713]: I0314 05:27:20.703469 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:20 crc kubenswrapper[4713]: I0314 05:27:20.704484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:20 crc kubenswrapper[4713]: I0314 05:27:20.704547 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:20 crc kubenswrapper[4713]: I0314 05:27:20.704562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:21 crc kubenswrapper[4713]: E0314 05:27:21.455828 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:21Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c9df25a8aeaaa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,LastTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:27:21 crc kubenswrapper[4713]: I0314 05:27:21.509526 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:21Z is after 2026-02-23T05:33:13Z Mar 14 05:27:22 crc kubenswrapper[4713]: I0314 05:27:22.512337 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:22Z is after 2026-02-23T05:33:13Z Mar 14 05:27:23 crc kubenswrapper[4713]: W0314 05:27:23.273047 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:23Z is after 2026-02-23T05:33:13Z Mar 14 05:27:23 crc kubenswrapper[4713]: E0314 05:27:23.273156 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:23 crc kubenswrapper[4713]: I0314 05:27:23.510858 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:23Z is after 2026-02-23T05:33:13Z Mar 14 05:27:23 crc kubenswrapper[4713]: W0314 05:27:23.767107 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:23Z is after 2026-02-23T05:33:13Z Mar 14 05:27:23 crc kubenswrapper[4713]: E0314 05:27:23.767240 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:24 crc kubenswrapper[4713]: W0314 05:27:24.412956 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:24Z is 
after 2026-02-23T05:33:13Z Mar 14 05:27:24 crc kubenswrapper[4713]: E0314 05:27:24.413039 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 05:27:24 crc kubenswrapper[4713]: I0314 05:27:24.511002 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:24Z is after 2026-02-23T05:33:13Z Mar 14 05:27:24 crc kubenswrapper[4713]: E0314 05:27:24.857819 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:24Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 14 05:27:24 crc kubenswrapper[4713]: I0314 05:27:24.859002 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:24 crc kubenswrapper[4713]: I0314 05:27:24.860649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:24 crc kubenswrapper[4713]: I0314 05:27:24.860683 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:24 crc kubenswrapper[4713]: I0314 05:27:24.860695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID"
Mar 14 05:27:24 crc kubenswrapper[4713]: I0314 05:27:24.860718 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 05:27:24 crc kubenswrapper[4713]: E0314 05:27:24.863251 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:24Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 14 05:27:25 crc kubenswrapper[4713]: I0314 05:27:25.512764 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:25Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:26 crc kubenswrapper[4713]: I0314 05:27:26.512766 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:26Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.510054 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:27Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.563586 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.564697 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.564746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.564760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.565558 4713 scope.go:117] "RemoveContainer" containerID="3e4029fa53d03c5549f5805f7a2ebd4c652adb0bbf2c2ae4102f4714dc5a139a"
Mar 14 05:27:27 crc kubenswrapper[4713]: E0314 05:27:27.642051 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.793560 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.793662 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.793721 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.793851 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.795177 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.795231 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.795243 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.795642 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"65362674790f184b43d25af7ea21bb1608f5df3a1172a55542521e877b689f12"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 14 05:27:27 crc kubenswrapper[4713]: I0314 05:27:27.795773 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://65362674790f184b43d25af7ea21bb1608f5df3a1172a55542521e877b689f12" gracePeriod=30
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.510329 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:28Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.724434 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.724826 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.726601 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fb33e315bacc5a127ec5971c671d84d2ca4c82c2f6ef3ee8177027bf801e9fbe" exitCode=255
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.726660 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fb33e315bacc5a127ec5971c671d84d2ca4c82c2f6ef3ee8177027bf801e9fbe"}
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.726696 4713 scope.go:117] "RemoveContainer" containerID="3e4029fa53d03c5549f5805f7a2ebd4c652adb0bbf2c2ae4102f4714dc5a139a"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.726809 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.732731 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.732756 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.732767 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.733391 4713 scope.go:117] "RemoveContainer" containerID="fb33e315bacc5a127ec5971c671d84d2ca4c82c2f6ef3ee8177027bf801e9fbe"
Mar 14 05:27:28 crc kubenswrapper[4713]: E0314 05:27:28.733555 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.737515 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.737929 4713 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="65362674790f184b43d25af7ea21bb1608f5df3a1172a55542521e877b689f12" exitCode=255
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.737982 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"65362674790f184b43d25af7ea21bb1608f5df3a1172a55542521e877b689f12"}
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.738020 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"27d84a02ccd79f3547e650b0c51decba1af1f2b6fa6d8fc9a0bde3f5a726d4ac"}
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.738139 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.738954 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.738979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:28 crc kubenswrapper[4713]: I0314 05:27:28.738989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:29 crc kubenswrapper[4713]: I0314 05:27:29.512795 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:29Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:29 crc kubenswrapper[4713]: I0314 05:27:29.742981 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 14 05:27:30 crc kubenswrapper[4713]: I0314 05:27:30.267364 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 05:27:30 crc kubenswrapper[4713]: I0314 05:27:30.267627 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:30 crc kubenswrapper[4713]: I0314 05:27:30.269104 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:30 crc kubenswrapper[4713]: I0314 05:27:30.269184 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:30 crc kubenswrapper[4713]: I0314 05:27:30.269195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:30 crc kubenswrapper[4713]: I0314 05:27:30.511863 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:30Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:31 crc kubenswrapper[4713]: E0314 05:27:31.460942 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:31Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c9df25a8aeaaa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,LastTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 05:27:31 crc kubenswrapper[4713]: I0314 05:27:31.510604 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:31Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:31 crc kubenswrapper[4713]: E0314 05:27:31.863386 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:31Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 14 05:27:31 crc kubenswrapper[4713]: I0314 05:27:31.863402 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:31 crc kubenswrapper[4713]: I0314 05:27:31.864683 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:31 crc kubenswrapper[4713]: I0314 05:27:31.864736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:31 crc kubenswrapper[4713]: I0314 05:27:31.864749 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:31 crc kubenswrapper[4713]: I0314 05:27:31.864782 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 05:27:31 crc kubenswrapper[4713]: E0314 05:27:31.868225 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:31Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 14 05:27:32 crc kubenswrapper[4713]: I0314 05:27:32.510865 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:32Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:33 crc kubenswrapper[4713]: I0314 05:27:33.510704 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:33Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:34 crc kubenswrapper[4713]: W0314 05:27:34.142352 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:34Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:34 crc kubenswrapper[4713]: E0314 05:27:34.142479 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.263381 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.263551 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.264807 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.264858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.264874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.265562 4713 scope.go:117] "RemoveContainer" containerID="fb33e315bacc5a127ec5971c671d84d2ca4c82c2f6ef3ee8177027bf801e9fbe"
Mar 14 05:27:34 crc kubenswrapper[4713]: E0314 05:27:34.265767 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.512031 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:34Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.792792 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.792948 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.794118 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.794160 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:34 crc kubenswrapper[4713]: I0314 05:27:34.794168 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:35 crc kubenswrapper[4713]: I0314 05:27:35.184266 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 05:27:35 crc kubenswrapper[4713]: I0314 05:27:35.184455 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:35 crc kubenswrapper[4713]: I0314 05:27:35.185605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:35 crc kubenswrapper[4713]: I0314 05:27:35.185651 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:35 crc kubenswrapper[4713]: I0314 05:27:35.185664 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:35 crc kubenswrapper[4713]: I0314 05:27:35.186332 4713 scope.go:117] "RemoveContainer" containerID="fb33e315bacc5a127ec5971c671d84d2ca4c82c2f6ef3ee8177027bf801e9fbe"
Mar 14 05:27:35 crc kubenswrapper[4713]: E0314 05:27:35.186547 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 14 05:27:35 crc kubenswrapper[4713]: I0314 05:27:35.509582 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:35Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:36 crc kubenswrapper[4713]: I0314 05:27:36.456097 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 14 05:27:36 crc kubenswrapper[4713]: E0314 05:27:36.460997 4713 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 05:27:36 crc kubenswrapper[4713]: E0314 05:27:36.462234 4713 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 14 05:27:36 crc kubenswrapper[4713]: I0314 05:27:36.510108 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:36Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:37 crc kubenswrapper[4713]: I0314 05:27:37.510181 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:37Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:37 crc kubenswrapper[4713]: E0314 05:27:37.642171 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 05:27:37 crc kubenswrapper[4713]: I0314 05:27:37.792857 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 05:27:37 crc kubenswrapper[4713]: I0314 05:27:37.792928 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 05:27:38 crc kubenswrapper[4713]: I0314 05:27:38.512518 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:38Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:38 crc kubenswrapper[4713]: E0314 05:27:38.867333 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:38Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 14 05:27:38 crc kubenswrapper[4713]: I0314 05:27:38.868344 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:38 crc kubenswrapper[4713]: I0314 05:27:38.870196 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:38 crc kubenswrapper[4713]: I0314 05:27:38.870277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:38 crc kubenswrapper[4713]: I0314 05:27:38.870295 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:38 crc kubenswrapper[4713]: I0314 05:27:38.870331 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 05:27:38 crc kubenswrapper[4713]: E0314 05:27:38.873969 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:38Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 14 05:27:39 crc kubenswrapper[4713]: I0314 05:27:39.511640 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:39Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:40 crc kubenswrapper[4713]: I0314 05:27:40.512281 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:40Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:41 crc kubenswrapper[4713]: E0314 05:27:41.466056 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:41Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c9df25a8aeaaa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,LastTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 05:27:41 crc kubenswrapper[4713]: I0314 05:27:41.511996 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:41Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:42 crc kubenswrapper[4713]: I0314 05:27:42.512745 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:42Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:42 crc kubenswrapper[4713]: W0314 05:27:42.759638 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:42Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:42 crc kubenswrapper[4713]: E0314 05:27:42.759775 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 05:27:43 crc kubenswrapper[4713]: I0314 05:27:43.509945 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:43Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:44 crc kubenswrapper[4713]: I0314 05:27:44.510995 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:44Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:44 crc kubenswrapper[4713]: W0314 05:27:44.651526 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:44Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:44 crc kubenswrapper[4713]: E0314 05:27:44.651628 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 05:27:45 crc kubenswrapper[4713]: I0314 05:27:45.510924 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:45Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:45 crc kubenswrapper[4713]: E0314 05:27:45.872248 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:45Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 14 05:27:45 crc kubenswrapper[4713]: I0314 05:27:45.874307 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:45 crc kubenswrapper[4713]: I0314 05:27:45.875580 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:45 crc kubenswrapper[4713]: I0314 05:27:45.875615 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:45 crc kubenswrapper[4713]: I0314 05:27:45.875624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:45 crc kubenswrapper[4713]: I0314 05:27:45.875645 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 05:27:45 crc kubenswrapper[4713]: E0314 05:27:45.880551 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:45Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 14 05:27:46 crc kubenswrapper[4713]: W0314 05:27:46.016754 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:46Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:46 crc kubenswrapper[4713]: E0314 05:27:46.017280 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 05:27:46 crc kubenswrapper[4713]: I0314 05:27:46.510852 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:46Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:46 crc kubenswrapper[4713]: I0314 05:27:46.563300 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:46 crc kubenswrapper[4713]: I0314 05:27:46.564735 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:46 crc kubenswrapper[4713]: I0314 05:27:46.564793 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:46 crc kubenswrapper[4713]: I0314 05:27:46.564812 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:46 crc kubenswrapper[4713]: I0314 05:27:46.565776 4713 scope.go:117] "RemoveContainer" containerID="fb33e315bacc5a127ec5971c671d84d2ca4c82c2f6ef3ee8177027bf801e9fbe"
Mar 14 05:27:46 crc kubenswrapper[4713]: E0314 05:27:46.566076 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 14 05:27:47 crc kubenswrapper[4713]: I0314 05:27:47.143197 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 14 05:27:47 crc kubenswrapper[4713]: I0314 05:27:47.143496 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 05:27:47 crc kubenswrapper[4713]: I0314 05:27:47.144921 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:27:47 crc kubenswrapper[4713]: I0314 05:27:47.144984 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:27:47 crc kubenswrapper[4713]: I0314 05:27:47.144998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:27:47 crc kubenswrapper[4713]: I0314 05:27:47.509105 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:47Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:47 crc kubenswrapper[4713]: E0314 05:27:47.642626 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 05:27:47 crc kubenswrapper[4713]: I0314 05:27:47.794440 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 05:27:47 crc kubenswrapper[4713]: I0314 05:27:47.794627 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 05:27:48 crc kubenswrapper[4713]: I0314 05:27:48.511609 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:48Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:49 crc kubenswrapper[4713]: I0314 05:27:49.510953 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:49Z is after 2026-02-23T05:33:13Z
Mar 14 05:27:50 crc kubenswrapper[4713]: I0314 05:27:50.510461 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-14T05:27:50Z is after 2026-02-23T05:33:13Z Mar 14 05:27:51 crc kubenswrapper[4713]: E0314 05:27:51.470106 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:51Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c9df25a8aeaaa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,LastTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:27:51 crc kubenswrapper[4713]: I0314 05:27:51.512661 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:51Z is after 2026-02-23T05:33:13Z Mar 14 05:27:52 crc kubenswrapper[4713]: I0314 05:27:52.509577 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:27:52Z is after 2026-02-23T05:33:13Z Mar 14 05:27:52 crc kubenswrapper[4713]: E0314 05:27:52.880141 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io 
\"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 05:27:52 crc kubenswrapper[4713]: I0314 05:27:52.881131 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:52 crc kubenswrapper[4713]: I0314 05:27:52.882151 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:52 crc kubenswrapper[4713]: I0314 05:27:52.882198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:52 crc kubenswrapper[4713]: I0314 05:27:52.882231 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:52 crc kubenswrapper[4713]: I0314 05:27:52.882262 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:27:52 crc kubenswrapper[4713]: E0314 05:27:52.886372 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 05:27:53 crc kubenswrapper[4713]: I0314 05:27:53.511796 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:27:54 crc kubenswrapper[4713]: I0314 05:27:54.512571 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:27:55 crc kubenswrapper[4713]: I0314 05:27:55.515368 4713 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:27:56 crc kubenswrapper[4713]: I0314 05:27:56.513606 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.512194 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.563012 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.564901 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.564962 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.564980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.565663 4713 scope.go:117] "RemoveContainer" containerID="fb33e315bacc5a127ec5971c671d84d2ca4c82c2f6ef3ee8177027bf801e9fbe" Mar 14 05:27:57 crc kubenswrapper[4713]: E0314 05:27:57.642770 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.793471 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.793596 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.793693 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.793969 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.796153 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.796201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.796229 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.796739 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"27d84a02ccd79f3547e650b0c51decba1af1f2b6fa6d8fc9a0bde3f5a726d4ac"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed 
startup probe, will be restarted" Mar 14 05:27:57 crc kubenswrapper[4713]: I0314 05:27:57.796849 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://27d84a02ccd79f3547e650b0c51decba1af1f2b6fa6d8fc9a0bde3f5a726d4ac" gracePeriod=30 Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.404832 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.406162 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.406708 4713 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="27d84a02ccd79f3547e650b0c51decba1af1f2b6fa6d8fc9a0bde3f5a726d4ac" exitCode=255 Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.406779 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"27d84a02ccd79f3547e650b0c51decba1af1f2b6fa6d8fc9a0bde3f5a726d4ac"} Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.406834 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0563ae22d9421b3f5da796c13d8cadc6514d52f1cfad39b4e2b36f6bb3ce460e"} Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.406867 4713 scope.go:117] "RemoveContainer" 
containerID="65362674790f184b43d25af7ea21bb1608f5df3a1172a55542521e877b689f12" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.406966 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.408116 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.408138 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.408148 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.408545 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.410509 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd"} Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.410641 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.411570 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.411589 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.411601 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 14 05:27:58 crc kubenswrapper[4713]: I0314 05:27:58.513880 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.417684 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.422783 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.423449 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.425850 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd" exitCode=255 Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.425926 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd"} Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.426001 4713 scope.go:117] "RemoveContainer" containerID="fb33e315bacc5a127ec5971c671d84d2ca4c82c2f6ef3ee8177027bf801e9fbe" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.426350 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 
05:27:59.427727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.427769 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.427782 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.428686 4713 scope.go:117] "RemoveContainer" containerID="27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd" Mar 14 05:27:59 crc kubenswrapper[4713]: E0314 05:27:59.434626 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.513748 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:27:59 crc kubenswrapper[4713]: W0314 05:27:59.790200 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 14 05:27:59 crc kubenswrapper[4713]: E0314 05:27:59.790280 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at 
the cluster scope" logger="UnhandledError" Mar 14 05:27:59 crc kubenswrapper[4713]: E0314 05:27:59.886283 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.887378 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.889248 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.889342 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.889366 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:27:59 crc kubenswrapper[4713]: I0314 05:27:59.889402 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:27:59 crc kubenswrapper[4713]: E0314 05:27:59.899619 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 05:28:00 crc kubenswrapper[4713]: I0314 05:28:00.267507 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:28:00 crc kubenswrapper[4713]: I0314 05:28:00.268011 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:00 crc kubenswrapper[4713]: I0314 05:28:00.270052 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 14 05:28:00 crc kubenswrapper[4713]: I0314 05:28:00.270126 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:00 crc kubenswrapper[4713]: I0314 05:28:00.270164 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:00 crc kubenswrapper[4713]: I0314 05:28:00.431383 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 05:28:00 crc kubenswrapper[4713]: I0314 05:28:00.514360 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.476270 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25a8aeaaa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,LastTimestamp:2026-03-14 05:26:57.503832746 +0000 UTC m=+0.591742046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.482223 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d77f5ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552922028 +0000 UTC m=+0.640831328,LastTimestamp:2026-03-14 05:26:57.552922028 +0000 UTC m=+0.640831328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.487347 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d782073 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552932979 +0000 UTC m=+0.640842279,LastTimestamp:2026-03-14 05:26:57.552932979 +0000 UTC m=+0.640842279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.498605 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d784283 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552941699 +0000 UTC m=+0.640850999,LastTimestamp:2026-03-14 05:26:57.552941699 +0000 UTC m=+0.640850999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.501798 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df2638a8b88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.654803336 +0000 UTC m=+0.742712636,LastTimestamp:2026-03-14 05:26:57.654803336 +0000 UTC m=+0.742712636,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.507747 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d77f5ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d77f5ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552922028 +0000 UTC m=+0.640831328,LastTimestamp:2026-03-14 05:26:57.664272649 +0000 UTC m=+0.752181949,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: I0314 05:28:01.508542 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.512945 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d782073\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d782073 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552932979 +0000 UTC m=+0.640842279,LastTimestamp:2026-03-14 05:26:57.66430626 +0000 UTC m=+0.752215560,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.518342 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d784283\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d784283 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552941699 +0000 UTC m=+0.640850999,LastTimestamp:2026-03-14 05:26:57.664318681 +0000 UTC m=+0.752227981,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.523568 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d77f5ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d77f5ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552922028 +0000 UTC m=+0.640831328,LastTimestamp:2026-03-14 05:26:57.665917843 +0000 UTC m=+0.753827143,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.528274 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d782073\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d782073 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552932979 +0000 UTC m=+0.640842279,LastTimestamp:2026-03-14 05:26:57.665934264 +0000 UTC m=+0.753843564,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.533254 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d784283\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d784283 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552941699 +0000 UTC m=+0.640850999,LastTimestamp:2026-03-14 05:26:57.665942934 +0000 UTC m=+0.753852234,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.540447 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d77f5ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d77f5ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552922028 +0000 UTC 
m=+0.640831328,LastTimestamp:2026-03-14 05:26:57.666683454 +0000 UTC m=+0.754592754,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.544988 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d782073\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d782073 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552932979 +0000 UTC m=+0.640842279,LastTimestamp:2026-03-14 05:26:57.666700674 +0000 UTC m=+0.754609974,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.549298 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d784283\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d784283 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552941699 +0000 UTC m=+0.640850999,LastTimestamp:2026-03-14 05:26:57.666712405 +0000 UTC m=+0.754621705,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.554540 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d77f5ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d77f5ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552922028 +0000 UTC m=+0.640831328,LastTimestamp:2026-03-14 05:26:57.667300328 +0000 UTC m=+0.755209628,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.561363 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d77f5ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d77f5ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552922028 +0000 UTC m=+0.640831328,LastTimestamp:2026-03-14 05:26:57.667282398 +0000 UTC m=+0.755191708,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.567748 4713 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d782073\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d782073 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552932979 +0000 UTC m=+0.640842279,LastTimestamp:2026-03-14 05:26:57.667324119 +0000 UTC m=+0.755233419,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.573921 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d784283\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d784283 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552941699 +0000 UTC m=+0.640850999,LastTimestamp:2026-03-14 05:26:57.66733585 +0000 UTC m=+0.755245150,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.580653 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d782073\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d782073 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552932979 +0000 UTC m=+0.640842279,LastTimestamp:2026-03-14 05:26:57.66735378 +0000 UTC m=+0.755263080,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.586366 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d784283\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d784283 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552941699 +0000 UTC m=+0.640850999,LastTimestamp:2026-03-14 05:26:57.667364821 +0000 UTC m=+0.755274121,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.593764 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d77f5ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d77f5ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552922028 +0000 UTC m=+0.640831328,LastTimestamp:2026-03-14 05:26:57.668239175 +0000 UTC m=+0.756148475,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.599713 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d782073\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d782073 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552932979 +0000 UTC m=+0.640842279,LastTimestamp:2026-03-14 05:26:57.668263915 +0000 UTC m=+0.756173215,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.603707 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d784283\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d784283 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552941699 +0000 UTC m=+0.640850999,LastTimestamp:2026-03-14 05:26:57.668279136 +0000 UTC m=+0.756188446,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.608899 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d77f5ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d77f5ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552922028 +0000 UTC m=+0.640831328,LastTimestamp:2026-03-14 05:26:57.668461033 +0000 UTC m=+0.756370333,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.613821 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c9df25d782073\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c9df25d782073 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:57.552932979 +0000 UTC 
m=+0.640842279,LastTimestamp:2026-03-14 05:26:57.668482404 +0000 UTC m=+0.756391704,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.620795 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c9df27bea9b00 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.063751936 +0000 UTC m=+1.151661236,LastTimestamp:2026-03-14 05:26:58.063751936 +0000 UTC m=+1.151661236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.626428 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df27c0eb5b5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.066118069 +0000 UTC m=+1.154027359,LastTimestamp:2026-03-14 05:26:58.066118069 +0000 UTC m=+1.154027359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.631005 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df27c69c266 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.072085094 +0000 UTC m=+1.159994394,LastTimestamp:2026-03-14 05:26:58.072085094 +0000 UTC m=+1.159994394,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.637538 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df27ca4755b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.075931995 +0000 UTC m=+1.163841295,LastTimestamp:2026-03-14 05:26:58.075931995 +0000 UTC m=+1.163841295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.642747 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df27d1cabb1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.083810225 +0000 UTC 
m=+1.171719525,LastTimestamp:2026-03-14 05:26:58.083810225 +0000 UTC m=+1.171719525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.649922 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df2a1776674 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.693736052 +0000 UTC m=+1.781645352,LastTimestamp:2026-03-14 05:26:58.693736052 +0000 UTC m=+1.781645352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.656095 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2a178b2bb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 
05:26:58.693821115 +0000 UTC m=+1.781730415,LastTimestamp:2026-03-14 05:26:58.693821115 +0000 UTC m=+1.781730415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.660879 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2a178dac6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.693831366 +0000 UTC m=+1.781740666,LastTimestamp:2026-03-14 05:26:58.693831366 +0000 UTC m=+1.781740666,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.665073 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c9df2a17e3c04 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created 
container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.69418394 +0000 UTC m=+1.782093240,LastTimestamp:2026-03-14 05:26:58.69418394 +0000 UTC m=+1.782093240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.671191 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2a184d615 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.694616597 +0000 UTC m=+1.782525897,LastTimestamp:2026-03-14 05:26:58.694616597 +0000 UTC m=+1.782525897,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.675704 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2a220189b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.704791707 +0000 UTC m=+1.792701007,LastTimestamp:2026-03-14 05:26:58.704791707 +0000 UTC m=+1.792701007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.680472 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c9df2a239dc10 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.706480144 +0000 UTC m=+1.794389434,LastTimestamp:2026-03-14 05:26:58.706480144 +0000 UTC m=+1.794389434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.684449 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2a23ee4e6 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.706810086 +0000 UTC m=+1.794719386,LastTimestamp:2026-03-14 05:26:58.706810086 +0000 UTC m=+1.794719386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.689571 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df2a240e3b3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.706940851 +0000 UTC m=+1.794850151,LastTimestamp:2026-03-14 05:26:58.706940851 +0000 UTC m=+1.794850151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.693895 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2a2441c75 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.707151989 +0000 UTC m=+1.795061289,LastTimestamp:2026-03-14 05:26:58.707151989 +0000 UTC m=+1.795061289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.697914 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2a251804a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.708029514 +0000 UTC m=+1.795938814,LastTimestamp:2026-03-14 05:26:58.708029514 +0000 UTC m=+1.795938814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.703534 4713 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2b53df26c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.025515116 +0000 UTC m=+2.113424406,LastTimestamp:2026-03-14 05:26:59.025515116 +0000 UTC m=+2.113424406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.708667 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2b5c972c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.034657475 +0000 UTC m=+2.122566775,LastTimestamp:2026-03-14 05:26:59.034657475 +0000 UTC m=+2.122566775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.718492 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2b5d917a5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.035682725 +0000 UTC m=+2.123592025,LastTimestamp:2026-03-14 05:26:59.035682725 +0000 UTC m=+2.123592025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.725936 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2bf86b766 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.198056294 +0000 UTC m=+2.285965614,LastTimestamp:2026-03-14 05:26:59.198056294 +0000 UTC m=+2.285965614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.731243 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2c051bff9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.211362297 +0000 UTC m=+2.299271607,LastTimestamp:2026-03-14 05:26:59.211362297 +0000 UTC m=+2.299271607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.735109 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2c062f8b4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.212490932 +0000 UTC m=+2.300400252,LastTimestamp:2026-03-14 05:26:59.212490932 +0000 UTC m=+2.300400252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.740322 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2cb71439e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.39797699 +0000 UTC m=+2.485886290,LastTimestamp:2026-03-14 05:26:59.39797699 +0000 UTC 
m=+2.485886290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.745371 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2cc04efae openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.40765483 +0000 UTC m=+2.495564130,LastTimestamp:2026-03-14 05:26:59.40765483 +0000 UTC m=+2.495564130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.750779 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2d64e88b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.580250288 +0000 UTC m=+2.668159588,LastTimestamp:2026-03-14 05:26:59.580250288 +0000 UTC m=+2.668159588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.755966 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df2d69eb197 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.585503639 +0000 UTC m=+2.673412939,LastTimestamp:2026-03-14 05:26:59.585503639 +0000 UTC m=+2.673412939,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.760490 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c9df2d6f2e3d7 openshift-machine-config-operator 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.591021527 +0000 UTC m=+2.678930817,LastTimestamp:2026-03-14 05:26:59.591021527 +0000 UTC m=+2.678930817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.766897 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2d74e13f5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.596997621 +0000 UTC m=+2.684906921,LastTimestamp:2026-03-14 05:26:59.596997621 +0000 UTC m=+2.684906921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 
05:28:01.773794 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2e37ce62e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.801392686 +0000 UTC m=+2.889301986,LastTimestamp:2026-03-14 05:26:59.801392686 +0000 UTC m=+2.889301986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.780367 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df2e3c8203a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.806322746 +0000 UTC m=+2.894232046,LastTimestamp:2026-03-14 05:26:59.806322746 +0000 UTC m=+2.894232046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc 
kubenswrapper[4713]: E0314 05:28:01.785101 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c9df2e404f466 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.810309222 +0000 UTC m=+2.898218522,LastTimestamp:2026-03-14 05:26:59.810309222 +0000 UTC m=+2.898218522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.790608 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2e40c62d8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.810796248 +0000 UTC m=+2.898705548,LastTimestamp:2026-03-14 05:26:59.810796248 +0000 UTC m=+2.898705548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.796189 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2e425e832 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.812468786 +0000 UTC m=+2.900378086,LastTimestamp:2026-03-14 05:26:59.812468786 +0000 UTC m=+2.900378086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.802195 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2e43d2fe4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.813994468 +0000 UTC m=+2.901903778,LastTimestamp:2026-03-14 05:26:59.813994468 +0000 UTC m=+2.901903778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.809047 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c9df2e525b82e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.82923371 +0000 UTC m=+2.917143010,LastTimestamp:2026-03-14 05:26:59.82923371 +0000 UTC m=+2.917143010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.812744 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2e572bbf9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.834280953 +0000 UTC m=+2.922190253,LastTimestamp:2026-03-14 05:26:59.834280953 +0000 UTC m=+2.922190253,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.816388 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2e5812abe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.835226814 +0000 UTC m=+2.923136124,LastTimestamp:2026-03-14 05:26:59.835226814 +0000 UTC m=+2.923136124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.820733 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2f000b570 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.011357552 +0000 UTC m=+3.099266852,LastTimestamp:2026-03-14 05:27:00.011357552 +0000 UTC m=+3.099266852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.826896 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2f03413ed openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.014724077 +0000 UTC m=+3.102633377,LastTimestamp:2026-03-14 05:27:00.014724077 +0000 UTC m=+3.102633377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.832542 4713 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2f10ce2d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.028932824 +0000 UTC m=+3.116842124,LastTimestamp:2026-03-14 05:27:00.028932824 +0000 UTC m=+3.116842124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.839299 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2f119fc33 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.029791283 +0000 UTC m=+3.117700583,LastTimestamp:2026-03-14 05:27:00.029791283 +0000 UTC m=+3.117700583,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.847300 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2f1394827 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.031842343 +0000 UTC m=+3.119751643,LastTimestamp:2026-03-14 05:27:00.031842343 +0000 UTC m=+3.119751643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.854930 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2f14c3976 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.033083766 +0000 UTC m=+3.120993066,LastTimestamp:2026-03-14 05:27:00.033083766 +0000 UTC m=+3.120993066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.862513 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2fddd3bf9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.243913721 +0000 UTC m=+3.331823031,LastTimestamp:2026-03-14 05:27:00.243913721 +0000 UTC m=+3.331823031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.869636 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2fe046e7f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.246482559 +0000 UTC m=+3.334391869,LastTimestamp:2026-03-14 05:27:00.246482559 +0000 UTC m=+3.334391869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.875437 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2feca410d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.259447053 +0000 UTC m=+3.347356373,LastTimestamp:2026-03-14 05:27:00.259447053 +0000 UTC m=+3.347356373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.882595 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df2fedebdaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.260789679 +0000 UTC m=+3.348698989,LastTimestamp:2026-03-14 05:27:00.260789679 +0000 UTC m=+3.348698989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.888107 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c9df2fef7f8b2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.262443186 +0000 UTC m=+3.350352496,LastTimestamp:2026-03-14 05:27:00.262443186 +0000 UTC m=+3.350352496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc 
kubenswrapper[4713]: E0314 05:28:01.896181 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df303169769 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.331558761 +0000 UTC m=+3.419468071,LastTimestamp:2026-03-14 05:27:00.331558761 +0000 UTC m=+3.419468071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.902496 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df309b79d71 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.442774897 +0000 UTC m=+3.530684197,LastTimestamp:2026-03-14 05:27:00.442774897 +0000 UTC m=+3.530684197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.909178 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df30a994551 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.457563473 +0000 UTC m=+3.545472773,LastTimestamp:2026-03-14 05:27:00.457563473 +0000 UTC m=+3.545472773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.913371 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df30aa63a4f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.458412623 +0000 UTC m=+3.546321923,LastTimestamp:2026-03-14 05:27:00.458412623 +0000 UTC m=+3.546321923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.920549 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df313a4439c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.609278876 +0000 UTC m=+3.697188176,LastTimestamp:2026-03-14 05:27:00.609278876 +0000 UTC m=+3.697188176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.928327 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df316fea3b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.665533361 +0000 UTC m=+3.753442661,LastTimestamp:2026-03-14 05:27:00.665533361 +0000 UTC m=+3.753442661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.935540 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df3179ceeea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.675907306 +0000 UTC m=+3.763816606,LastTimestamp:2026-03-14 05:27:00.675907306 +0000 UTC m=+3.763816606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.940394 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df31ec821f4 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.796178932 +0000 UTC m=+3.884088232,LastTimestamp:2026-03-14 05:27:00.796178932 +0000 UTC m=+3.884088232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.944506 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df31f789670 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.807743088 +0000 UTC m=+3.895652388,LastTimestamp:2026-03-14 05:27:00.807743088 +0000 UTC m=+3.895652388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.949480 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df3501ea456 openshift-etcd 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:01.62393199 +0000 UTC m=+4.711841300,LastTimestamp:2026-03-14 05:27:01.62393199 +0000 UTC m=+4.711841300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.954110 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df35ad14a2d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:01.803412013 +0000 UTC m=+4.891321343,LastTimestamp:2026-03-14 05:27:01.803412013 +0000 UTC m=+4.891321343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.960490 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df35b964e33 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:01.816323635 +0000 UTC m=+4.904232935,LastTimestamp:2026-03-14 05:27:01.816323635 +0000 UTC m=+4.904232935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.968102 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df35ba9b7aa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:01.817595818 +0000 UTC m=+4.905505128,LastTimestamp:2026-03-14 05:27:01.817595818 +0000 UTC m=+4.905505128,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.974791 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189c9df369e44b8c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.056315788 +0000 UTC m=+5.144225098,LastTimestamp:2026-03-14 05:27:02.056315788 +0000 UTC m=+5.144225098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.981149 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df36a8ccb02 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.067358466 +0000 UTC m=+5.155267766,LastTimestamp:2026-03-14 05:27:02.067358466 +0000 UTC m=+5.155267766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.987722 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df36a9e4a76 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.068505206 +0000 UTC m=+5.156414506,LastTimestamp:2026-03-14 05:27:02.068505206 +0000 UTC m=+5.156414506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:01 crc kubenswrapper[4713]: E0314 05:28:01.995265 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df374349cc4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.22935162 +0000 UTC m=+5.317260920,LastTimestamp:2026-03-14 05:27:02.22935162 +0000 UTC m=+5.317260920,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.000395 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df3750c927c 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.243504764 +0000 UTC m=+5.331414064,LastTimestamp:2026-03-14 05:27:02.243504764 +0000 UTC m=+5.331414064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.007455 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df3751c23a0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.24452496 +0000 UTC m=+5.332434250,LastTimestamp:2026-03-14 05:27:02.24452496 +0000 UTC m=+5.332434250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.012381 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189c9df380e1bee6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.442024678 +0000 UTC m=+5.529933978,LastTimestamp:2026-03-14 05:27:02.442024678 +0000 UTC m=+5.529933978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.018580 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df382607955 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.467107157 +0000 UTC m=+5.555016467,LastTimestamp:2026-03-14 05:27:02.467107157 +0000 UTC m=+5.555016467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.020529 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df382726331 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.468281137 +0000 UTC m=+5.556190437,LastTimestamp:2026-03-14 05:27:02.468281137 +0000 UTC m=+5.556190437,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.028481 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c9df38bb6a0f1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.623748337 +0000 UTC m=+5.711657627,LastTimestamp:2026-03-14 05:27:02.623748337 +0000 UTC m=+5.711657627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.033455 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189c9df38c50e41a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:02.633858074 +0000 UTC m=+5.721767394,LastTimestamp:2026-03-14 05:27:02.633858074 +0000 UTC m=+5.721767394,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.041976 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 05:28:02 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189c9df4bfd26135 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 14 05:28:02 crc kubenswrapper[4713]: body: Mar 14 05:28:02 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:07.792949557 +0000 UTC m=+10.880858897,LastTimestamp:2026-03-14 05:27:07.792949557 +0000 UTC m=+10.880858897,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 14 05:28:02 crc kubenswrapper[4713]: > Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.046575 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df4bfd3c1e0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:07.79303984 +0000 UTC m=+10.880949170,LastTimestamp:2026-03-14 05:27:07.79303984 +0000 UTC m=+10.880949170,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.053737 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 14 05:28:02 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-apiserver-crc.189c9df59a1ce5b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 14 05:28:02 
crc kubenswrapper[4713]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 05:28:02 crc kubenswrapper[4713]: Mar 14 05:28:02 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:11.455266225 +0000 UTC m=+14.543175525,LastTimestamp:2026-03-14 05:27:11.455266225 +0000 UTC m=+14.543175525,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 05:28:02 crc kubenswrapper[4713]: > Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.060870 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df59a1d82b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:11.455306417 +0000 UTC m=+14.543215707,LastTimestamp:2026-03-14 05:27:11.455306417 +0000 UTC m=+14.543215707,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.068016 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Mar 14 05:28:02 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-apiserver-crc.189c9df59aae6555 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 14 05:28:02 crc kubenswrapper[4713]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 14 05:28:02 crc kubenswrapper[4713]: Mar 14 05:28:02 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:11.464801621 +0000 UTC m=+14.552710931,LastTimestamp:2026-03-14 05:27:11.464801621 +0000 UTC m=+14.552710931,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 05:28:02 crc kubenswrapper[4713]: > Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.074535 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c9df59a1d82b1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df59a1d82b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:11.455306417 +0000 UTC m=+14.543215707,LastTimestamp:2026-03-14 05:27:11.464844642 +0000 UTC m=+14.552753942,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.080306 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c9df30aa63a4f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df30aa63a4f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.458412623 +0000 UTC m=+3.546321923,LastTimestamp:2026-03-14 05:27:11.673139611 +0000 UTC m=+14.761048911,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.086799 4713 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189c9df316fea3b1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df316fea3b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.665533361 +0000 UTC m=+3.753442661,LastTimestamp:2026-03-14 05:27:11.846182323 +0000 UTC m=+14.934091653,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.095740 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c9df3179ceeea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c9df3179ceeea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:00.675907306 +0000 UTC m=+3.763816606,LastTimestamp:2026-03-14 05:27:11.85397074 +0000 UTC m=+14.941880040,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.103642 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 05:28:02 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189c9df713ef28a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 05:28:02 crc kubenswrapper[4713]: body: Mar 14 05:28:02 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:17.794056361 +0000 UTC m=+20.881965661,LastTimestamp:2026-03-14 05:27:17.794056361 +0000 UTC m=+20.881965661,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 05:28:02 crc kubenswrapper[4713]: > Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.111743 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df713f01aef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:17.794118383 +0000 UTC m=+20.882027683,LastTimestamp:2026-03-14 05:27:17.794118383 +0000 UTC m=+20.882027683,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.119023 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c9df713ef28a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 05:28:02 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189c9df713ef28a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 05:28:02 crc kubenswrapper[4713]: body: Mar 14 05:28:02 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:17.794056361 +0000 UTC m=+20.881965661,LastTimestamp:2026-03-14 05:27:27.793635319 
+0000 UTC m=+30.881544619,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 05:28:02 crc kubenswrapper[4713]: > Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.126283 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c9df713f01aef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df713f01aef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:17.794118383 +0000 UTC m=+20.882027683,LastTimestamp:2026-03-14 05:27:27.793690971 +0000 UTC m=+30.881600271,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.135346 4713 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df968150aba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:27.795759802 +0000 UTC m=+30.883669102,LastTimestamp:2026-03-14 05:27:27.795759802 +0000 UTC m=+30.883669102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.141083 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c9df2a251804a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2a251804a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:58.708029514 +0000 UTC m=+1.795938814,LastTimestamp:2026-03-14 05:27:27.912780525 +0000 UTC m=+31.000689825,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.149581 4713 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c9df2b53df26c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2b53df26c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.025515116 +0000 UTC m=+2.113424406,LastTimestamp:2026-03-14 05:27:28.053725648 +0000 UTC m=+31.141634948,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.154744 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c9df2b5c972c3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df2b5c972c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:26:59.034657475 +0000 UTC 
m=+2.122566775,LastTimestamp:2026-03-14 05:27:28.063679433 +0000 UTC m=+31.151588733,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.163541 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c9df713ef28a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 05:28:02 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189c9df713ef28a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 05:28:02 crc kubenswrapper[4713]: body: Mar 14 05:28:02 crc kubenswrapper[4713]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:17.794056361 +0000 UTC m=+20.881965661,LastTimestamp:2026-03-14 05:27:37.792909803 +0000 UTC m=+40.880819103,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 05:28:02 crc kubenswrapper[4713]: > Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.168677 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c9df713f01aef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189c9df713f01aef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:17.794118383 +0000 UTC m=+20.882027683,LastTimestamp:2026-03-14 05:27:37.792953235 +0000 UTC m=+40.880862525,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:28:02 crc kubenswrapper[4713]: E0314 05:28:02.174889 4713 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c9df713ef28a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 05:28:02 crc kubenswrapper[4713]: &Event{ObjectMeta:{kube-controller-manager-crc.189c9df713ef28a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 05:28:02 crc kubenswrapper[4713]: body: Mar 14 05:28:02 crc kubenswrapper[4713]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:27:17.794056361 +0000 UTC m=+20.881965661,LastTimestamp:2026-03-14 05:27:47.794548988 +0000 UTC m=+50.882458328,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 05:28:02 crc kubenswrapper[4713]: > Mar 14 05:28:02 crc kubenswrapper[4713]: I0314 05:28:02.511854 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:03 crc kubenswrapper[4713]: I0314 05:28:03.510833 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.263151 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.263448 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.265171 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.265432 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.265525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.266233 4713 scope.go:117] "RemoveContainer" 
containerID="27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd" Mar 14 05:28:04 crc kubenswrapper[4713]: E0314 05:28:04.266545 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.512356 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.793464 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.793709 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.794904 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.795009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.795074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:04 crc kubenswrapper[4713]: I0314 05:28:04.799972 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 
05:28:05.184072 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 05:28:05.184392 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 05:28:05.186355 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 05:28:05.186397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 05:28:05.186411 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 05:28:05.187012 4713 scope.go:117] "RemoveContainer" containerID="27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd" Mar 14 05:28:05 crc kubenswrapper[4713]: E0314 05:28:05.187222 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 05:28:05.447889 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 05:28:05.448778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 05:28:05.448808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:05 crc 
kubenswrapper[4713]: I0314 05:28:05.448819 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:05 crc kubenswrapper[4713]: I0314 05:28:05.511831 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:06 crc kubenswrapper[4713]: I0314 05:28:06.511813 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:06 crc kubenswrapper[4713]: E0314 05:28:06.892571 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 05:28:06 crc kubenswrapper[4713]: I0314 05:28:06.900525 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:06 crc kubenswrapper[4713]: I0314 05:28:06.902082 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:06 crc kubenswrapper[4713]: I0314 05:28:06.902127 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:06 crc kubenswrapper[4713]: I0314 05:28:06.902139 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:06 crc kubenswrapper[4713]: I0314 05:28:06.902169 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:28:06 crc kubenswrapper[4713]: E0314 05:28:06.906667 4713 kubelet_node_status.go:99] 
"Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 05:28:07 crc kubenswrapper[4713]: I0314 05:28:07.511226 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:07 crc kubenswrapper[4713]: E0314 05:28:07.643451 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 05:28:08 crc kubenswrapper[4713]: I0314 05:28:08.464347 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 05:28:08 crc kubenswrapper[4713]: I0314 05:28:08.477640 4713 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 05:28:08 crc kubenswrapper[4713]: I0314 05:28:08.510901 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:09 crc kubenswrapper[4713]: I0314 05:28:09.512779 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:10 crc kubenswrapper[4713]: I0314 05:28:10.270914 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:28:10 crc kubenswrapper[4713]: I0314 05:28:10.271056 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 14 05:28:10 crc kubenswrapper[4713]: I0314 05:28:10.274050 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:10 crc kubenswrapper[4713]: I0314 05:28:10.274085 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:10 crc kubenswrapper[4713]: I0314 05:28:10.274095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:10 crc kubenswrapper[4713]: W0314 05:28:10.463508 4713 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:10 crc kubenswrapper[4713]: E0314 05:28:10.463554 4713 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 14 05:28:10 crc kubenswrapper[4713]: I0314 05:28:10.510678 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:11 crc kubenswrapper[4713]: I0314 05:28:11.511836 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:12 crc kubenswrapper[4713]: I0314 05:28:12.516157 4713 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.102333 4713 csr.go:261] certificate signing request csr-22hpv is approved, waiting to be issued Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.112735 4713 csr.go:257] certificate signing request csr-22hpv is issued Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.177122 4713 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.375501 4713 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.907452 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.908591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.908674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.908698 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.908903 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.918301 4713 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.918587 4713 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 14 05:28:13 crc kubenswrapper[4713]: E0314 05:28:13.918619 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting 
node \"crc\": node \"crc\" not found" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.921599 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.921657 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.921676 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.921701 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.921723 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:13Z","lastTransitionTime":"2026-03-14T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:13 crc kubenswrapper[4713]: E0314 05:28:13.940543 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.949283 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.949550 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.949655 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.949735 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.949819 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:13Z","lastTransitionTime":"2026-03-14T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:13 crc kubenswrapper[4713]: E0314 05:28:13.964891 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.973607 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.973681 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.973710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.973736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.973757 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:13Z","lastTransitionTime":"2026-03-14T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:13 crc kubenswrapper[4713]: E0314 05:28:13.984873 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.999099 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.999145 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.999176 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.999191 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:13 crc kubenswrapper[4713]: I0314 05:28:13.999200 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:13Z","lastTransitionTime":"2026-03-14T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.010677 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.010847 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.010884 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.110982 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:14 crc kubenswrapper[4713]: I0314 05:28:14.114746 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-29 12:22:40.023052439 +0000 UTC Mar 14 05:28:14 crc kubenswrapper[4713]: I0314 05:28:14.114785 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6246h54m25.908270019s for next certificate rotation Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.211364 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.311916 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.412270 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.512724 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.613756 4713 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.714065 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.814580 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:14 crc kubenswrapper[4713]: E0314 05:28:14.915770 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.016365 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: I0314 05:28:15.095704 4713 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.116736 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.216912 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.317701 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.419003 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.519542 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.620460 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.721513 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.822990 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:15 crc kubenswrapper[4713]: E0314 05:28:15.923792 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc kubenswrapper[4713]: E0314 05:28:16.024312 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc kubenswrapper[4713]: E0314 05:28:16.124774 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc kubenswrapper[4713]: E0314 05:28:16.225877 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc kubenswrapper[4713]: E0314 05:28:16.326834 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc kubenswrapper[4713]: E0314 05:28:16.427461 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc kubenswrapper[4713]: E0314 05:28:16.527898 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc kubenswrapper[4713]: E0314 05:28:16.628921 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc kubenswrapper[4713]: E0314 05:28:16.729902 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc kubenswrapper[4713]: E0314 05:28:16.830475 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:16 crc 
kubenswrapper[4713]: E0314 05:28:16.931056 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.031631 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.132015 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.233086 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.334002 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.435061 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.535650 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.636315 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.643702 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.736692 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.837719 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:17 crc kubenswrapper[4713]: E0314 05:28:17.938023 4713 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.038119 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.139374 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.239719 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.340371 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.441372 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.542008 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.642642 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.743623 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.843712 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:18 crc kubenswrapper[4713]: E0314 05:28:18.944040 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.044853 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.145635 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.246822 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.347938 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.448473 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.549152 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: I0314 05:28:19.562917 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:19 crc kubenswrapper[4713]: I0314 05:28:19.563951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:19 crc kubenswrapper[4713]: I0314 05:28:19.563990 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:19 crc kubenswrapper[4713]: I0314 05:28:19.564004 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:19 crc kubenswrapper[4713]: I0314 05:28:19.564661 4713 scope.go:117] "RemoveContainer" containerID="27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.564829 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.649610 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.750316 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.850840 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:19 crc kubenswrapper[4713]: E0314 05:28:19.951608 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.053032 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.154325 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.254800 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.355636 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.456713 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.557792 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.658314 4713 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: I0314 05:28:20.670406 4713 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.758649 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.859750 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:20 crc kubenswrapper[4713]: E0314 05:28:20.960042 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.060814 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.161278 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.262138 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.362266 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.463105 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.563429 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:21 crc kubenswrapper[4713]: I0314 05:28:21.564276 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:21 crc kubenswrapper[4713]: I0314 05:28:21.566085 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:21 crc kubenswrapper[4713]: I0314 05:28:21.566483 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:21 crc kubenswrapper[4713]: I0314 05:28:21.566695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.664165 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.764306 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.865125 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:21 crc kubenswrapper[4713]: E0314 05:28:21.966140 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.066589 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.167508 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.268770 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.370104 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.470637 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.571104 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.671275 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.771595 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.871944 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:22 crc kubenswrapper[4713]: E0314 05:28:22.972870 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.073500 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.174415 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.275350 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.376522 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.477409 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.577582 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.679001 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.780031 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.880474 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:23 crc kubenswrapper[4713]: E0314 05:28:23.981755 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.082631 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.183935 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.282398 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.291365 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.291698 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.291937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.292158 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.292404 4713 setters.go:603] "Node became not ready" node="crc"
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:24Z","lastTransitionTime":"2026-03-14T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.312440 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.318724 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.318941 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.319015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.319134 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.319327 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:24Z","lastTransitionTime":"2026-03-14T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.346437 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.353332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.353413 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.353436 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.353463 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.353484 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:24Z","lastTransitionTime":"2026-03-14T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.367003 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.372766 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.372878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.372906 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.372937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:24 crc kubenswrapper[4713]: I0314 05:28:24.372959 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:24Z","lastTransitionTime":"2026-03-14T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.394190 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.394417 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.394470 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.495858 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.596220 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.696714 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.797539 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.898121 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:24 crc kubenswrapper[4713]: E0314 05:28:24.999439 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:25 crc kubenswrapper[4713]: E0314 05:28:25.101263 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:25 crc kubenswrapper[4713]: E0314 05:28:25.201688 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:25 crc kubenswrapper[4713]: E0314 05:28:25.302234 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:25 crc kubenswrapper[4713]: E0314 05:28:25.403634 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:25 crc kubenswrapper[4713]: E0314 05:28:25.504081 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:25 crc kubenswrapper[4713]: E0314 05:28:25.604892 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:25 crc kubenswrapper[4713]: E0314 05:28:25.706011 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:25 crc kubenswrapper[4713]: E0314 05:28:25.807473 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:25 crc kubenswrapper[4713]: E0314 05:28:25.908694 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc kubenswrapper[4713]: E0314 05:28:26.009871 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc kubenswrapper[4713]: E0314 05:28:26.111351 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc kubenswrapper[4713]: E0314 05:28:26.212842 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc kubenswrapper[4713]: E0314 05:28:26.312969 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc kubenswrapper[4713]: E0314 05:28:26.413407 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc 
kubenswrapper[4713]: E0314 05:28:26.514729 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc kubenswrapper[4713]: E0314 05:28:26.615285 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc kubenswrapper[4713]: E0314 05:28:26.716097 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc kubenswrapper[4713]: E0314 05:28:26.816847 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:26 crc kubenswrapper[4713]: E0314 05:28:26.917270 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.018290 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.118779 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.219934 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.320995 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.421841 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.523366 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.624517 4713 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.644741 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.725641 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.826643 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:27 crc kubenswrapper[4713]: E0314 05:28:27.927278 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.027858 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.128696 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.229697 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.330810 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.431549 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.532280 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.633129 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.734270 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.835131 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:28 crc kubenswrapper[4713]: E0314 05:28:28.935716 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc kubenswrapper[4713]: E0314 05:28:29.036803 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc kubenswrapper[4713]: E0314 05:28:29.137937 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc kubenswrapper[4713]: E0314 05:28:29.238336 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc kubenswrapper[4713]: E0314 05:28:29.339193 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc kubenswrapper[4713]: E0314 05:28:29.440745 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc kubenswrapper[4713]: E0314 05:28:29.541856 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc kubenswrapper[4713]: E0314 05:28:29.643001 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc kubenswrapper[4713]: E0314 05:28:29.743777 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc kubenswrapper[4713]: E0314 05:28:29.844449 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:29 crc 
kubenswrapper[4713]: E0314 05:28:29.945313 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.046593 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.147887 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.248476 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.349493 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.450500 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.551064 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: I0314 05:28:30.563793 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:30 crc kubenswrapper[4713]: I0314 05:28:30.565912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:30 crc kubenswrapper[4713]: I0314 05:28:30.565980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:30 crc kubenswrapper[4713]: I0314 05:28:30.566004 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.651618 4713 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.752818 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.853963 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:30 crc kubenswrapper[4713]: E0314 05:28:30.954690 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.054870 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.155977 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.257143 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.358551 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.460169 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.560823 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.661602 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.762099 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.863122 4713 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:31 crc kubenswrapper[4713]: E0314 05:28:31.963609 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.064088 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.164663 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.265015 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.365496 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.465893 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: I0314 05:28:32.562767 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 05:28:32 crc kubenswrapper[4713]: I0314 05:28:32.564090 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:32 crc kubenswrapper[4713]: I0314 05:28:32.564115 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:32 crc kubenswrapper[4713]: I0314 05:28:32.564125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:32 crc kubenswrapper[4713]: I0314 05:28:32.564678 4713 scope.go:117] "RemoveContainer" containerID="27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd" Mar 14 05:28:32 crc kubenswrapper[4713]: 
E0314 05:28:32.564851 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.566410 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.667370 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.768504 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.869285 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:32 crc kubenswrapper[4713]: E0314 05:28:32.970041 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: E0314 05:28:33.071196 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: E0314 05:28:33.171742 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: E0314 05:28:33.272956 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: E0314 05:28:33.374422 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: 
E0314 05:28:33.475111 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: E0314 05:28:33.576642 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: E0314 05:28:33.677575 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: E0314 05:28:33.778556 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: E0314 05:28:33.879551 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:33 crc kubenswrapper[4713]: E0314 05:28:33.979682 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.079923 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.180806 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.281060 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.381305 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.482237 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.583084 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.683474 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.724375 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.731788 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.731828 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.731838 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.731855 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.731865 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:34Z","lastTransitionTime":"2026-03-14T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.741183 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.745753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.745797 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.745813 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.745836 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.745852 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:34Z","lastTransitionTime":"2026-03-14T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.760513 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.763541 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.763760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.763915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.764062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.764198 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:34Z","lastTransitionTime":"2026-03-14T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.779512 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.782852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.782995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.783059 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.783126 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:34 crc kubenswrapper[4713]: I0314 05:28:34.783196 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:34Z","lastTransitionTime":"2026-03-14T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.796713 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.797194 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.797343 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.897536 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:34 crc kubenswrapper[4713]: E0314 05:28:34.998648 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:35 crc kubenswrapper[4713]: E0314 05:28:35.099537 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:35 crc kubenswrapper[4713]: E0314 05:28:35.200155 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:35 crc kubenswrapper[4713]: E0314 05:28:35.301581 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:35 crc kubenswrapper[4713]: E0314 05:28:35.402169 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:35 crc kubenswrapper[4713]: E0314 05:28:35.503264 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:35 crc kubenswrapper[4713]: E0314 05:28:35.604039 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:35 crc kubenswrapper[4713]: E0314 05:28:35.705167 4713 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:35 crc kubenswrapper[4713]: E0314 05:28:35.806269 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:35 crc kubenswrapper[4713]: E0314 05:28:35.906871 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc kubenswrapper[4713]: E0314 05:28:36.007397 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc kubenswrapper[4713]: E0314 05:28:36.108476 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc kubenswrapper[4713]: E0314 05:28:36.208707 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc kubenswrapper[4713]: E0314 05:28:36.308990 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc kubenswrapper[4713]: E0314 05:28:36.409568 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc kubenswrapper[4713]: E0314 05:28:36.509705 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc kubenswrapper[4713]: E0314 05:28:36.610541 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc kubenswrapper[4713]: E0314 05:28:36.711107 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc kubenswrapper[4713]: E0314 05:28:36.812037 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:36 crc 
kubenswrapper[4713]: E0314 05:28:36.912745 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.013404 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.113565 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.214488 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.315536 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.416476 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.517352 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.617497 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.645080 4713 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.717938 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.818893 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:37 crc kubenswrapper[4713]: E0314 05:28:37.919740 4713 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.020767 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.121128 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.221444 4713 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.302270 4713 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.325199 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.325286 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.325303 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.325329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.325345 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:38Z","lastTransitionTime":"2026-03-14T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.427884 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.427968 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.427996 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.428041 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.428065 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:38Z","lastTransitionTime":"2026-03-14T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.531372 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.531486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.531497 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.531512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.531524 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:38Z","lastTransitionTime":"2026-03-14T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.546080 4713 apiserver.go:52] "Watching apiserver" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.553792 4713 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.554332 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.554921 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.555353 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.555529 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.555519 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.555877 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.555988 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.556060 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.556090 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.556599 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.561028 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.561079 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.561137 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.561137 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.561265 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.561627 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.561724 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.561933 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.563620 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.601767 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.612000 4713 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.621465 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.632759 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.635517 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.635587 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.635605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.635644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.635660 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:38Z","lastTransitionTime":"2026-03-14T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638604 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638654 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638683 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638726 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638751 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638774 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638796 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638819 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638841 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638865 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638889 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638910 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638933 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638957 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638977 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.638998 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639019 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639041 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639061 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639081 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639100 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639120 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:38 
crc kubenswrapper[4713]: I0314 05:28:38.639143 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639172 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639198 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639241 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639265 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639290 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639313 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639336 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639359 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639382 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639406 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 
14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639434 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639459 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639485 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639508 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639530 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639551 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639573 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639598 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639622 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639642 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639666 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 05:28:38 crc 
kubenswrapper[4713]: I0314 05:28:38.639664 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639687 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639794 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639841 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639881 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639929 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.639965 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640000 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640018 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640037 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640074 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640112 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640148 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640182 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640248 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640295 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640334 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640374 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640407 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640439 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" 
(UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640516 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640550 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640590 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640633 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640670 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640702 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640738 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640780 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640816 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640850 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 05:28:38 crc 
kubenswrapper[4713]: I0314 05:28:38.640885 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640887 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640920 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640958 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.640994 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641069 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641105 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641143 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641229 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641192 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641258 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641385 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641436 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641614 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641700 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641790 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641873 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642000 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642116 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642255 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642348 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642399 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642489 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642673 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642763 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642852 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642978 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643579 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643707 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643801 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643889 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644091 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644280 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644520 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644607 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644677 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644720 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644792 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644835 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644916 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644952 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645090 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645146 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645183 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645242 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645273 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645334 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645368 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645396 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") 
pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645423 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645448 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645471 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645497 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645522 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645544 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645567 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645589 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645612 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645633 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645654 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645675 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645743 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645780 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645808 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645832 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645855 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645880 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645903 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645925 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645948 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.645975 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 05:28:38 crc 
kubenswrapper[4713]: I0314 05:28:38.645998 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646023 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646048 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646072 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646096 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646120 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646142 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646168 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646189 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646239 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646272 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc 
kubenswrapper[4713]: I0314 05:28:38.646300 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646323 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646346 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646374 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646439 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646469 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646494 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646519 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646568 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646592 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646616 
4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646642 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646665 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646689 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646741 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646766 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646790 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646816 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646848 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646887 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646922 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646956 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646981 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647006 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647029 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647052 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647076 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647172 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647198 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647254 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647281 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647305 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 05:28:38 crc 
kubenswrapper[4713]: I0314 05:28:38.647331 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647356 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647379 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647403 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647428 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647451 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647480 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647502 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647527 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647550 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647605 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647634 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647662 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647689 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647715 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647739 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647798 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647838 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647866 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647894 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647920 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.647986 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.648023 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.648079 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.648095 4713 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.648110 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.648124 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.656525 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.667099 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.668874 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.641703 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642046 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642638 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.642740 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643380 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643453 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643589 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643570 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643605 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.669512 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643717 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.643698 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644354 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644405 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644727 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.644797 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646396 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.646635 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.650156 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.650590 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.651082 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.651440 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.651484 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.651795 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.652106 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.652349 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.652609 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.653079 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.653549 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.653983 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.654031 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.654362 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.654382 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.654371 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.654463 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.654652 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.654762 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.654790 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.654939 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.655131 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.655289 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.655660 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.655677 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.655814 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.655790 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.655885 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.656173 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.656267 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.656460 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.657551 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.657604 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.657721 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.658038 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.658118 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.658424 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.658433 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.658730 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.670772 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.658817 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.659175 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.659232 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.659289 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.659295 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.659351 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.659415 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.659438 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.660087 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.660155 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.660192 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.660301 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.660499 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.660503 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.660805 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.660118 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.660953 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.661156 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.661583 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.661710 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.661777 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.662139 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.662298 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.662417 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.662728 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.662696 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.663390 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.663625 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.663994 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.664072 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.664330 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.665153 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.665164 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.665186 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.665356 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.665546 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.665612 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.665684 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.665785 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.665972 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.666250 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.666340 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.666396 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.666518 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.666679 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.666743 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.667020 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.667067 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.667641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.668336 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.668516 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.668757 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.668976 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.668953 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.669019 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.669196 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.669822 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.664315 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.669914 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.669933 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.670278 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.670306 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.670566 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.670638 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.664110 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.670724 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.671275 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.671365 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.671551 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.671591 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.671394 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.671972 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.672045 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.672097 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.672123 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.672123 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.672514 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.672914 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.673131 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.673334 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.673785 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:39.173519493 +0000 UTC m=+102.261428833 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.673611 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.673988 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:39.173887105 +0000 UTC m=+102.261796445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.669768 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.675713 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.682512 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.682856 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.682875 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.683035 4713 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.683164 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.683396 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c").
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.683774 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.683808 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.683924 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.683927 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.684284 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.684423 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.684670 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.684698 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.684719 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.684793 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.684808 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.685251 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.685288 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.685501 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.685787 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.686447 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.686919 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.687278 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.674157 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 05:28:39.174082401 +0000 UTC m=+102.261991741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.687567 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:39.187539734 +0000 UTC m=+102.275449204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:38 crc kubenswrapper[4713]: E0314 05:28:38.687601 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:39.187590436 +0000 UTC m=+102.275499986 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.688038 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.688992 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.689154 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.689268 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.689357 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.689380 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.689993 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.690025 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.690262 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.690725 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.690818 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.691563 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.691621 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.692314 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.692722 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.692734 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.693093 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.693313 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.693307 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.678058 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.678173 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.693488 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.694049 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.694290 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.699356 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.699943 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.700592 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.702216 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.702452 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.702555 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.702625 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.702844 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.702969 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.703766 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.706558 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.709896 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.710322 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" 
(UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.711522 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.713543 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.723162 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.723308 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.733338 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.735657 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.738510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.738551 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.738565 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.738584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.738599 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:38Z","lastTransitionTime":"2026-03-14T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.748949 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.748989 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749051 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749064 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749073 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749082 4713 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749091 4713 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749090 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749100 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749187 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749199 4713 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749231 4713 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749242 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 14 
05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749251 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749261 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749271 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749282 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749294 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749304 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749313 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749350 4713 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749360 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749370 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749380 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749388 4713 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749396 4713 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath 
\"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749405 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749414 4713 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749424 4713 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749432 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749441 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749449 4713 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749458 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: 
I0314 05:28:38.749466 4713 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749474 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749482 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749490 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749499 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749507 4713 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749515 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749523 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749531 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749541 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749549 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749558 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749567 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749578 4713 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749587 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749598 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749607 4713 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749618 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749634 4713 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749651 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749662 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749671 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc 
kubenswrapper[4713]: I0314 05:28:38.749679 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749687 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749696 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749706 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749716 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749726 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749735 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749745 4713 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749754 4713 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749764 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749773 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749780 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749788 4713 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749796 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749804 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749812 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749820 4713 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749828 4713 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749836 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749845 4713 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749852 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749860 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" 
DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749869 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749877 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749886 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749893 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749901 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749909 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749918 4713 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749926 4713 reconciler_common.go:293] "Volume detached for 
volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749934 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749943 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749950 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749959 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749968 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749976 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749985 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.749993 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750001 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750010 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750018 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750025 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750033 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750042 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 
05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750050 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750058 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750067 4713 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750075 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750085 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750093 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750101 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 
05:28:38.750109 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750117 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750126 4713 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750134 4713 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750142 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750150 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750158 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750167 4713 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750176 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750185 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750193 4713 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750295 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750304 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750313 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750321 4713 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath 
\"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750329 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750337 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750345 4713 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750353 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750361 4713 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750369 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750378 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750387 
4713 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750395 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750404 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750412 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750421 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750429 4713 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750437 4713 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750445 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750452 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750460 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750472 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750481 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750489 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750497 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750505 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc 
kubenswrapper[4713]: I0314 05:28:38.750513 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750521 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750529 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750539 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750547 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750555 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750563 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750570 4713 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750578 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750588 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750597 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750605 4713 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750614 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750622 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750630 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750638 4713 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750646 4713 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750653 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750662 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750670 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750678 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750687 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750695 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750703 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750712 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750721 4713 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750730 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750739 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750747 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 
crc kubenswrapper[4713]: I0314 05:28:38.750756 4713 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750764 4713 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750772 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750780 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750788 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750796 4713 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750804 4713 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750812 4713 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750820 4713 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750828 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750836 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750844 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750853 4713 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750861 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750870 4713 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750878 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750888 4713 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750896 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750904 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750912 4713 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.750919 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.841469 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 
05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.841515 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.841536 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.841560 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.841576 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:38Z","lastTransitionTime":"2026-03-14T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.876087 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.885289 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.894856 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 05:28:38 crc kubenswrapper[4713]: W0314 05:28:38.912070 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-00037632ca39ba4a4f946783d0a8457977d333c52d92a6a8ecdcb23e8757c6a5 WatchSource:0}: Error finding container 00037632ca39ba4a4f946783d0a8457977d333c52d92a6a8ecdcb23e8757c6a5: Status 404 returned error can't find the container with id 00037632ca39ba4a4f946783d0a8457977d333c52d92a6a8ecdcb23e8757c6a5 Mar 14 05:28:38 crc kubenswrapper[4713]: W0314 05:28:38.912368 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e9f8d13aabc501f54727c413ac6f96fdf584e109472441109b732f646dc2a036 WatchSource:0}: Error finding container e9f8d13aabc501f54727c413ac6f96fdf584e109472441109b732f646dc2a036: Status 404 returned error can't find the container with id e9f8d13aabc501f54727c413ac6f96fdf584e109472441109b732f646dc2a036 Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.946406 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.946449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.946463 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.946482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:38 crc kubenswrapper[4713]: I0314 05:28:38.946492 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:38Z","lastTransitionTime":"2026-03-14T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.049299 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.049344 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.049357 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.049376 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.049390 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.152366 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.152683 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.152697 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.152716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.152732 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255115 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255124 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255138 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255150 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255744 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255795 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255819 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255838 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.255875 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:28:40.255848031 +0000 UTC m=+103.343757331 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.255920 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.255932 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.255947 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.255957 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.255991 4713 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:40.255979155 +0000 UTC m=+103.343888455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.256046 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.256060 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.256070 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.256103 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:40.256095809 +0000 UTC m=+103.344005109 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.256142 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.256167 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:40.256160931 +0000 UTC m=+103.344070231 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.256255 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:28:39 crc kubenswrapper[4713]: E0314 05:28:39.256307 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:40.256295805 +0000 UTC m=+103.344205175 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.357332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.357369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.357380 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.357394 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.357405 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.459403 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.459444 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.459457 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.459477 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.459488 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.536646 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.536707 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.536727 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e9f8d13aabc501f54727c413ac6f96fdf584e109472441109b732f646dc2a036"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.538304 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.538336 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6052c3c0f8823356623b7fee4cb4b18df91f4dcceece61942207c70186a5c8ca"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.539925 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"00037632ca39ba4a4f946783d0a8457977d333c52d92a6a8ecdcb23e8757c6a5"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.554073 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.562684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.562738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.562756 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc 
kubenswrapper[4713]: I0314 05:28:39.562779 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.562796 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.568796 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.569949 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.571047 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.572504 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.573886 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.575925 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.577009 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.578234 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.580284 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.581742 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.583622 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.584466 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.585691 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.586265 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.586863 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.587868 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.588423 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.588493 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.589433 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.589827 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.590461 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.591640 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 
05:28:39.592331 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.593789 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.594607 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.596067 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.596839 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.597810 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.599301 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.599918 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 
05:28:39.601179 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.601892 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.603246 4713 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.603431 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.606306 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.608011 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.609299 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.612543 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.613004 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.614004 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.616632 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.617576 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.619333 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.619865 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.620869 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.621668 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.622915 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.623464 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.624038 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.624888 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.625658 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.626162 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.626703 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.627260 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.627784 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.628383 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.628869 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.635444 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.655628 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.664845 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.664886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.664901 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.664922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.664938 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.671676 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.684964 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.697721 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.713225 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.731783 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.748012 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.772026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.772080 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.772093 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.772113 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.772126 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.874328 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.874386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.874403 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.874427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.874443 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.978346 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-h4rjf"] Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.978464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.978544 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.978573 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.978606 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.978629 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:39Z","lastTransitionTime":"2026-03-14T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.978722 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-h4rjf" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.983182 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.983434 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.984043 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 14 05:28:39 crc kubenswrapper[4713]: I0314 05:28:39.997129 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:39Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.010713 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.026656 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.041123 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.059984 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.062943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2bzw\" (UniqueName: \"kubernetes.io/projected/4f4a76d9-a890-4a25-bd97-411e6a8a9bdd-kube-api-access-g2bzw\") pod \"node-resolver-h4rjf\" (UID: \"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\") " pod="openshift-dns/node-resolver-h4rjf" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.063195 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f4a76d9-a890-4a25-bd97-411e6a8a9bdd-hosts-file\") pod \"node-resolver-h4rjf\" (UID: \"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\") " pod="openshift-dns/node-resolver-h4rjf" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.076852 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.081574 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.081659 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.081679 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.081714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.081735 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:40Z","lastTransitionTime":"2026-03-14T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.094469 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.164532 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2bzw\" (UniqueName: \"kubernetes.io/projected/4f4a76d9-a890-4a25-bd97-411e6a8a9bdd-kube-api-access-g2bzw\") pod \"node-resolver-h4rjf\" (UID: \"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\") " pod="openshift-dns/node-resolver-h4rjf" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.164600 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f4a76d9-a890-4a25-bd97-411e6a8a9bdd-hosts-file\") pod \"node-resolver-h4rjf\" (UID: \"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\") " pod="openshift-dns/node-resolver-h4rjf" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.164728 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f4a76d9-a890-4a25-bd97-411e6a8a9bdd-hosts-file\") pod \"node-resolver-h4rjf\" (UID: 
\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\") " pod="openshift-dns/node-resolver-h4rjf" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.188484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.188542 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.188555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.188575 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.188590 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:40Z","lastTransitionTime":"2026-03-14T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.206632 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2bzw\" (UniqueName: \"kubernetes.io/projected/4f4a76d9-a890-4a25-bd97-411e6a8a9bdd-kube-api-access-g2bzw\") pod \"node-resolver-h4rjf\" (UID: \"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\") " pod="openshift-dns/node-resolver-h4rjf" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.266500 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.266621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.266650 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.266673 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.266694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.266850 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.266876 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.266891 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.266956 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:42.266938146 +0000 UTC m=+105.354847446 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.267023 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:28:42.267016759 +0000 UTC m=+105.354926058 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.267078 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.267103 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:42.267097621 +0000 UTC m=+105.355006921 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.267380 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.267409 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.267544 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.267550 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:42.267516175 +0000 UTC m=+105.355425475 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.267557 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.267591 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:42.267583147 +0000 UTC m=+105.355492447 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.291034 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.291087 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.291101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.291120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.291133 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:40Z","lastTransitionTime":"2026-03-14T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.291981 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-h4rjf" Mar 14 05:28:40 crc kubenswrapper[4713]: W0314 05:28:40.306249 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f4a76d9_a890_4a25_bd97_411e6a8a9bdd.slice/crio-9de5b02d95cd23a58dfc02a2f9978aa5bc8f55a35e20f2c3573350f705658610 WatchSource:0}: Error finding container 9de5b02d95cd23a58dfc02a2f9978aa5bc8f55a35e20f2c3573350f705658610: Status 404 returned error can't find the container with id 9de5b02d95cd23a58dfc02a2f9978aa5bc8f55a35e20f2c3573350f705658610 Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.389357 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5l5jq"] Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.389965 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.393336 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.395393 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-sx769"] Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.395846 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.395901 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.395917 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.395941 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:40 crc kubenswrapper[4713]: 
I0314 05:28:40.395956 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:40Z","lastTransitionTime":"2026-03-14T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.396161 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.396804 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ls8z5"] Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.397073 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.397181 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.399348 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.399558 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.399712 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.400049 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 05:28:40 crc 
kubenswrapper[4713]: I0314 05:28:40.400790 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.400923 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.401038 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.401157 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.401289 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.413469 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.424981 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.441707 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.456235 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469247 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-cnibin\") pod \"multus-additional-cni-plugins-sx769\" (UID: 
\"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469280 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-var-lib-cni-multus\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469297 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469312 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-system-cni-dir\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469326 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-var-lib-kubelet\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469343 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6cc7fbb-a88a-4b94-89bb-1323e0751467-proxy-tls\") pod 
\"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469365 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-run-netns\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469381 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/703b6542-1a83-442a-9673-6a774399dd7e-multus-daemon-config\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469396 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnfbm\" (UniqueName: \"kubernetes.io/projected/703b6542-1a83-442a-9673-6a774399dd7e-kube-api-access-pnfbm\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469412 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-multus-cni-dir\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469427 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c6cc7fbb-a88a-4b94-89bb-1323e0751467-rootfs\") pod 
\"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469440 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-multus-conf-dir\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469454 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-run-multus-certs\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469468 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8308bcb2-29df-4a11-86f3-b031e612b314-cni-binary-copy\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469483 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8308bcb2-29df-4a11-86f3-b031e612b314-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469497 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c5t9\" 
(UniqueName: \"kubernetes.io/projected/8308bcb2-29df-4a11-86f3-b031e612b314-kube-api-access-5c5t9\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469513 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24t7\" (UniqueName: \"kubernetes.io/projected/c6cc7fbb-a88a-4b94-89bb-1323e0751467-kube-api-access-v24t7\") pod \"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469529 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c6cc7fbb-a88a-4b94-89bb-1323e0751467-mcd-auth-proxy-config\") pod \"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469545 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-var-lib-cni-bin\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469560 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-cnibin\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469575 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-system-cni-dir\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469595 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-os-release\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469609 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/703b6542-1a83-442a-9673-6a774399dd7e-cni-binary-copy\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469624 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-os-release\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469645 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-multus-socket-dir-parent\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469659 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-run-k8s-cni-cncf-io\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469684 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-hostroot\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.469699 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-etc-kubernetes\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.476059 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.487106 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.498708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.498757 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.498771 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.498789 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.498801 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:40Z","lastTransitionTime":"2026-03-14T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.501787 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.517272 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.535751 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.543963 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h4rjf" event={"ID":"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd","Type":"ContainerStarted","Data":"9de5b02d95cd23a58dfc02a2f9978aa5bc8f55a35e20f2c3573350f705658610"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.557509 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.562703 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.562814 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.562810 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.562998 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.563142 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:40 crc kubenswrapper[4713]: E0314 05:28:40.563327 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570416 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-var-lib-cni-bin\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570483 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c6cc7fbb-a88a-4b94-89bb-1323e0751467-mcd-auth-proxy-config\") pod \"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570519 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-system-cni-dir\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-cnibin\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570579 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-system-cni-dir\") pod \"multus-additional-cni-plugins-sx769\" (UID: 
\"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570584 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-os-release\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570637 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/703b6542-1a83-442a-9673-6a774399dd7e-cni-binary-copy\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570653 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-os-release\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570684 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-os-release\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570708 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-run-k8s-cni-cncf-io\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 
05:28:40.570686 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-run-k8s-cni-cncf-io\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570772 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-multus-socket-dir-parent\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570816 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-hostroot\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570847 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-etc-kubernetes\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570908 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-cnibin\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570944 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570979 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-system-cni-dir\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571013 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-var-lib-cni-multus\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571042 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6cc7fbb-a88a-4b94-89bb-1323e0751467-proxy-tls\") pod \"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571089 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-var-lib-kubelet\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571122 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/703b6542-1a83-442a-9673-6a774399dd7e-multus-daemon-config\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571152 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnfbm\" (UniqueName: \"kubernetes.io/projected/703b6542-1a83-442a-9673-6a774399dd7e-kube-api-access-pnfbm\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571183 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-multus-cni-dir\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571236 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-run-netns\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571271 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c6cc7fbb-a88a-4b94-89bb-1323e0751467-rootfs\") pod \"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571301 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-multus-conf-dir\") pod \"multus-5l5jq\" (UID: 
\"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c6cc7fbb-a88a-4b94-89bb-1323e0751467-mcd-auth-proxy-config\") pod \"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571333 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-run-multus-certs\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571358 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-os-release\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571376 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8308bcb2-29df-4a11-86f3-b031e612b314-cni-binary-copy\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571379 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-run-multus-certs\") pod \"multus-5l5jq\" (UID: 
\"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571389 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-var-lib-cni-multus\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571402 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8308bcb2-29df-4a11-86f3-b031e612b314-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571421 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-cnibin\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571422 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c5t9\" (UniqueName: \"kubernetes.io/projected/8308bcb2-29df-4a11-86f3-b031e612b314-kube-api-access-5c5t9\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571451 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v24t7\" (UniqueName: \"kubernetes.io/projected/c6cc7fbb-a88a-4b94-89bb-1323e0751467-kube-api-access-v24t7\") pod \"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " 
pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571317 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/703b6542-1a83-442a-9673-6a774399dd7e-cni-binary-copy\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.570550 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-var-lib-cni-bin\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571819 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-multus-socket-dir-parent\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571843 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-hostroot\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571863 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-etc-kubernetes\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.571884 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-cnibin\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.572135 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-system-cni-dir\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.572176 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8308bcb2-29df-4a11-86f3-b031e612b314-cni-binary-copy\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.572401 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-multus-cni-dir\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.572408 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-var-lib-kubelet\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.572459 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c6cc7fbb-a88a-4b94-89bb-1323e0751467-rootfs\") pod 
\"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.572515 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-host-run-netns\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.572549 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/703b6542-1a83-442a-9673-6a774399dd7e-multus-conf-dir\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.572963 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8308bcb2-29df-4a11-86f3-b031e612b314-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.573003 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/703b6542-1a83-442a-9673-6a774399dd7e-multus-daemon-config\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.574061 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8308bcb2-29df-4a11-86f3-b031e612b314-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " 
pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.575031 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.575799 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6cc7fbb-a88a-4b94-89bb-1323e0751467-proxy-tls\") pod \"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.591553 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.595935 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24t7\" (UniqueName: \"kubernetes.io/projected/c6cc7fbb-a88a-4b94-89bb-1323e0751467-kube-api-access-v24t7\") pod \"machine-config-daemon-ls8z5\" (UID: \"c6cc7fbb-a88a-4b94-89bb-1323e0751467\") " pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.596197 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c5t9\" (UniqueName: \"kubernetes.io/projected/8308bcb2-29df-4a11-86f3-b031e612b314-kube-api-access-5c5t9\") pod \"multus-additional-cni-plugins-sx769\" (UID: \"8308bcb2-29df-4a11-86f3-b031e612b314\") " pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.601047 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 
05:28:40.601089 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.601099 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.601116 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.601127 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:40Z","lastTransitionTime":"2026-03-14T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.602500 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnfbm\" (UniqueName: \"kubernetes.io/projected/703b6542-1a83-442a-9673-6a774399dd7e-kube-api-access-pnfbm\") pod \"multus-5l5jq\" (UID: \"703b6542-1a83-442a-9673-6a774399dd7e\") " pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.603319 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.611821 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.623745 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.634731 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.647083 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.672037 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.687131 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.704172 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.704389 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.704485 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.704562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.704627 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:40Z","lastTransitionTime":"2026-03-14T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.713673 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5l5jq" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.722792 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sx769" Mar 14 05:28:40 crc kubenswrapper[4713]: W0314 05:28:40.726717 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703b6542_1a83_442a_9673_6a774399dd7e.slice/crio-0d14d3c72757d0d884b410e35f8e478f204a653e9c160dcd1724343d0543489d WatchSource:0}: Error finding container 0d14d3c72757d0d884b410e35f8e478f204a653e9c160dcd1724343d0543489d: Status 404 returned error can't find the container with id 0d14d3c72757d0d884b410e35f8e478f204a653e9c160dcd1724343d0543489d Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.730612 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:28:40 crc kubenswrapper[4713]: W0314 05:28:40.762872 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6cc7fbb_a88a_4b94_89bb_1323e0751467.slice/crio-9b36336335e3767f9703a55dadac5dbd724568092cc733d740e6f9b9ef0dc539 WatchSource:0}: Error finding container 9b36336335e3767f9703a55dadac5dbd724568092cc733d740e6f9b9ef0dc539: Status 404 returned error can't find the container with id 9b36336335e3767f9703a55dadac5dbd724568092cc733d740e6f9b9ef0dc539 Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.807689 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.807714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.807722 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.807736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.807744 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:40Z","lastTransitionTime":"2026-03-14T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.810916 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4ds64"] Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.812578 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.814157 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.814229 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.814264 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.815155 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.815413 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.815540 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.816113 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.831196 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.842819 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.855443 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.870892 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.876928 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-etc-openvswitch\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.876971 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-ovn-kubernetes\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.876995 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-script-lib\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877017 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-slash\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877034 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-netns\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877049 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-log-socket\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877067 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tztwk\" (UniqueName: \"kubernetes.io/projected/6632626e-d806-4de3-b20a-6ee10099a464-kube-api-access-tztwk\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877086 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-env-overrides\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877255 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-var-lib-openvswitch\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877326 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6632626e-d806-4de3-b20a-6ee10099a464-ovn-node-metrics-cert\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877410 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-openvswitch\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877570 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-netd\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877655 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-kubelet\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877863 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-bin\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877935 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-systemd-units\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.877980 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-ovn\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.878027 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-systemd\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.878054 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.878079 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-config\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.878137 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-node-log\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.887486 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.905963 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.920716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.920823 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.920879 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.920934 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.920989 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:40Z","lastTransitionTime":"2026-03-14T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.922523 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.935769 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.955225 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.969615 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979250 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-env-overrides\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979385 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-var-lib-openvswitch\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979429 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6632626e-d806-4de3-b20a-6ee10099a464-ovn-node-metrics-cert\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979455 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-openvswitch\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979472 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-netd\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979548 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-kubelet\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979570 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-bin\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979633 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-systemd-units\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979656 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-ovn\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979697 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-netd\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979739 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-bin\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979758 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-kubelet\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979707 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-systemd\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979797 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-ovn\") pod \"ovnkube-node-4ds64\" (UID: 
\"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979775 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-systemd-units\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979707 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-var-lib-openvswitch\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979831 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979838 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979695 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-openvswitch\") pod \"ovnkube-node-4ds64\" 
(UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979858 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-config\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979890 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-node-log\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979967 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-etc-openvswitch\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979993 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-ovn-kubernetes\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980007 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-node-log\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980018 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-script-lib\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980065 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-ovn-kubernetes\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980070 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-slash\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980066 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-etc-openvswitch\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980101 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-netns\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc 
kubenswrapper[4713]: I0314 05:28:40.980126 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-log-socket\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980147 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tztwk\" (UniqueName: \"kubernetes.io/projected/6632626e-d806-4de3-b20a-6ee10099a464-kube-api-access-tztwk\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980163 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-env-overrides\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980178 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-slash\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.979745 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-systemd\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980249 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-log-socket\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980254 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-netns\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980798 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-script-lib\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.980970 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-config\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.985287 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:40 crc kubenswrapper[4713]: I0314 05:28:40.987145 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6632626e-d806-4de3-b20a-6ee10099a464-ovn-node-metrics-cert\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.001233 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tztwk\" (UniqueName: \"kubernetes.io/projected/6632626e-d806-4de3-b20a-6ee10099a464-kube-api-access-tztwk\") pod \"ovnkube-node-4ds64\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.023666 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.023726 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.023743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.023766 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.023783 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.126291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.126329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.126337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.126353 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.126362 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.139027 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.229007 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.229034 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.229070 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.229085 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.229094 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.331345 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.331577 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.331585 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.331600 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.331609 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.434450 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.434484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.434492 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.434508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.434519 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.537415 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.537487 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.537511 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.537538 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.537556 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.548574 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5l5jq" event={"ID":"703b6542-1a83-442a-9673-6a774399dd7e","Type":"ContainerStarted","Data":"ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.548610 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5l5jq" event={"ID":"703b6542-1a83-442a-9673-6a774399dd7e","Type":"ContainerStarted","Data":"0d14d3c72757d0d884b410e35f8e478f204a653e9c160dcd1724343d0543489d"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.549692 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.550969 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h4rjf" event={"ID":"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd","Type":"ContainerStarted","Data":"d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.552328 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.552362 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38"} Mar 14 05:28:41 crc 
kubenswrapper[4713]: I0314 05:28:41.552376 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"9b36336335e3767f9703a55dadac5dbd724568092cc733d740e6f9b9ef0dc539"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.554098 4713 generic.go:334] "Generic (PLEG): container finished" podID="8308bcb2-29df-4a11-86f3-b031e612b314" containerID="2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e" exitCode=0 Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.554149 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" event={"ID":"8308bcb2-29df-4a11-86f3-b031e612b314","Type":"ContainerDied","Data":"2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.554249 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" event={"ID":"8308bcb2-29df-4a11-86f3-b031e612b314","Type":"ContainerStarted","Data":"f68824b5c127b4ae408408dc083a88c6517ea2675ef3d66ed45e8e71f99f3c68"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.555253 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c" exitCode=0 Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.555283 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.555327 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" 
event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"f92c9f70bf67c34a4db97cec7349813e72f81c1680938ea3ea88419f027eb41b"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.575508 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountP
ath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.589963 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.606299 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.622962 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.634862 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.639154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.639184 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.639196 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.639257 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.639269 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.645454 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.655218 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.671944 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.682971 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.698769 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.717386 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.730012 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.742440 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.742493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.742529 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.742540 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.742555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.742564 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.753988 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.767458 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.781877 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.794045 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.804547 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.818220 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.834714 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.844476 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.844502 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.844512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.844529 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.844539 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.850122 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.872533 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.946653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.946691 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.946700 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.946713 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:41 crc kubenswrapper[4713]: I0314 05:28:41.946723 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:41Z","lastTransitionTime":"2026-03-14T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.051829 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.051889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.051899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.051921 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.051937 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.154951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.155808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.155866 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.155895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.155909 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.258032 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.258097 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.258115 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.258148 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.258171 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.294464 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.294570 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294618 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:28:46.294592187 +0000 UTC m=+109.382501487 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294645 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.294675 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294701 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:46.29468563 +0000 UTC m=+109.382594940 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.294718 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.294768 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294868 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294865 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294880 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294911 4713 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:46.294903547 +0000 UTC m=+109.382812847 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294916 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294929 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294959 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:46.294950319 +0000 UTC m=+109.382859619 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294896 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.294984 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.295044 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:46.295021561 +0000 UTC m=+109.382930861 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.360302 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.360330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.360339 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.360352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.360360 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.465904 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.466284 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.466295 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.466308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.466318 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.562620 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.562653 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.562628 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.562780 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.562852 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:42 crc kubenswrapper[4713]: E0314 05:28:42.562920 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.567783 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.567823 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.567833 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.567849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.567863 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.568652 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.568694 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.568707 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.568719 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.571356 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" event={"ID":"8308bcb2-29df-4a11-86f3-b031e612b314","Type":"ContainerStarted","Data":"48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.586995 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.600828 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.615421 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.629255 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.641601 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.654148 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.669269 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.670785 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.670815 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.670824 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.670838 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.670848 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.683074 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.696070 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.708891 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.724435 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.773708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.773749 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.773759 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.773774 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.773783 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.876772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.876822 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.876836 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.876853 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.876865 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.979754 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.979791 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.979800 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.979814 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:42 crc kubenswrapper[4713]: I0314 05:28:42.979824 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:42Z","lastTransitionTime":"2026-03-14T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.082406 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.082436 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.082443 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.082455 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.082463 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:43Z","lastTransitionTime":"2026-03-14T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.184884 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.184933 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.184948 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.184966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.184981 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:43Z","lastTransitionTime":"2026-03-14T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.287919 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.287980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.287998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.288024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.288065 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:43Z","lastTransitionTime":"2026-03-14T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.390833 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.390892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.390943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.390966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.390981 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:43Z","lastTransitionTime":"2026-03-14T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.494132 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.494255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.494281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.494310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.494329 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:43Z","lastTransitionTime":"2026-03-14T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.581491 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.581575 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.583657 4713 generic.go:334] "Generic (PLEG): container finished" podID="8308bcb2-29df-4a11-86f3-b031e612b314" containerID="48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3" exitCode=0 Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.583735 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" event={"ID":"8308bcb2-29df-4a11-86f3-b031e612b314","Type":"ContainerDied","Data":"48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.586051 4713 scope.go:117] "RemoveContainer" containerID="27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.587388 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.608397 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.613559 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.613597 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.613611 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.613629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.613641 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:43Z","lastTransitionTime":"2026-03-14T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.629455 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.649534 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.667970 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.682518 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.700780 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.716549 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.716706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.717075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.717098 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.717158 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.717176 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:43Z","lastTransitionTime":"2026-03-14T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.735310 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.755490 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.771922 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.784635 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T0
5:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.821644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.821717 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.821732 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.821748 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.821761 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:43Z","lastTransitionTime":"2026-03-14T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.924345 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.924379 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.924386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.924400 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:43 crc kubenswrapper[4713]: I0314 05:28:43.924409 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:43Z","lastTransitionTime":"2026-03-14T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.027275 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.027339 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.027349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.027366 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.027438 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.129266 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.129300 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.129308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.129323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.129331 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.231437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.231703 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.231722 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.231738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.231749 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.334217 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.334259 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.334272 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.334289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.334302 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.436903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.436953 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.436968 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.436987 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.437000 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.539808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.539885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.539908 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.539934 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.539953 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.563379 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.563437 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:44 crc kubenswrapper[4713]: E0314 05:28:44.563498 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:44 crc kubenswrapper[4713]: E0314 05:28:44.563613 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.563878 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:44 crc kubenswrapper[4713]: E0314 05:28:44.564090 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.591252 4713 generic.go:334] "Generic (PLEG): container finished" podID="8308bcb2-29df-4a11-86f3-b031e612b314" containerID="c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638" exitCode=0 Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.591351 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" event={"ID":"8308bcb2-29df-4a11-86f3-b031e612b314","Type":"ContainerDied","Data":"c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.594740 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.603037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.604428 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.621329 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.643803 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.643867 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.643879 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.643905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.643917 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.645593 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3
770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.659951 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.673881 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.684908 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.699453 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.714080 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.732182 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.745648 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.745684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.745951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.745979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.746293 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.746828 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.780899 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\
"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.798273 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.812995 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 
05:28:44.829968 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.843758 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.849790 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.849820 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.849832 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.849849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.849860 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.857180 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.860584 4713 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.866630 4713 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.866680 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.866699 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.866718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.866732 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.873930 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: E0314 05:28:44.878935 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.882912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.882943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.882955 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.882971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.882982 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.886035 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: E0314 05:28:44.893806 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.896898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.896938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.896949 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.896994 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.897007 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.900784 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: E0314 05:28:44.908519 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.911953 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.912044 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.912061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.912108 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.912122 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.916111 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: E0314 05:28:44.927536 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.928699 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.931524 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.931566 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.931576 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.931591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.931600 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.942131 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: E0314 05:28:44.943719 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: E0314 05:28:44.943877 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.952684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.952721 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.952731 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.952746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.952755 4713 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:44Z","lastTransitionTime":"2026-03-14T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.955431 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 05:28:44.970391 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:44 crc kubenswrapper[4713]: I0314 
05:28:44.996238 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.055255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.055289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.055297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.055312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.055320 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.157569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.157619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.157632 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.157650 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.157664 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.260710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.260770 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.260787 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.260812 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.260835 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.363911 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.363977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.363994 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.364018 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.364035 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.466736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.466799 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.466817 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.466840 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.466859 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.570036 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.570109 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.570131 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.570157 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.570179 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.619617 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.625269 4713 generic.go:334] "Generic (PLEG): container finished" podID="8308bcb2-29df-4a11-86f3-b031e612b314" containerID="e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6" exitCode=0 Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.627126 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" event={"ID":"8308bcb2-29df-4a11-86f3-b031e612b314","Type":"ContainerDied","Data":"e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.652952 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.668817 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.673182 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.673236 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.673251 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.673266 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.673275 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.682753 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.694228 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.704156 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.717139 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.728966 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.746573 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.766653 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.776742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.776776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.776786 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.776806 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.776818 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.782830 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.803449 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.813000 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.878842 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.878891 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.878905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.878923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.878936 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.981200 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.981252 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.981262 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.981277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:45 crc kubenswrapper[4713]: I0314 05:28:45.981290 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:45Z","lastTransitionTime":"2026-03-14T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.083671 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.083710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.083722 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.083739 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.083752 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:46Z","lastTransitionTime":"2026-03-14T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.186835 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.186906 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.186935 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.186959 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.186977 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:46Z","lastTransitionTime":"2026-03-14T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.290977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.291027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.291043 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.291068 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.291087 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:46Z","lastTransitionTime":"2026-03-14T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.333274 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.333395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.333423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.333455 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333547 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered 
Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333551 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:28:54.333506581 +0000 UTC m=+117.421415911 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333604 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:54.333587934 +0000 UTC m=+117.421497244 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333612 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333635 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333651 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333689 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333726 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.333727 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333741 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333832 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333777 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:54.333744588 +0000 UTC m=+117.421653898 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333870 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:54.333855192 +0000 UTC m=+117.421764532 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.333895 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:54.333882223 +0000 UTC m=+117.421791553 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.394047 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.394080 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.394090 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.394105 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.394116 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:46Z","lastTransitionTime":"2026-03-14T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.496418 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.496459 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.496467 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.496484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.496493 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:46Z","lastTransitionTime":"2026-03-14T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.563603 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.563754 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.564107 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.564177 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.564245 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:46 crc kubenswrapper[4713]: E0314 05:28:46.564302 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.598479 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.598529 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.598548 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.598571 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.598585 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:46Z","lastTransitionTime":"2026-03-14T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.631384 4713 generic.go:334] "Generic (PLEG): container finished" podID="8308bcb2-29df-4a11-86f3-b031e612b314" containerID="899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d" exitCode=0 Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.631463 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" event={"ID":"8308bcb2-29df-4a11-86f3-b031e612b314","Type":"ContainerDied","Data":"899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.657246 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.674747 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.690466 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.703110 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.705721 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.705778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.705792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.705815 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.705831 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:46Z","lastTransitionTime":"2026-03-14T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.720648 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.734128 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.749110 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.762479 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.774880 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5lt5l"] Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.775511 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.778758 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.778739 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b
19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.779000 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.779063 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.780854 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.791877 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.806886 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.807873 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.807908 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.807919 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.807934 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.807945 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:46Z","lastTransitionTime":"2026-03-14T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.819850 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f
20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.830537 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.839403 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.839763 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/81c9353e-167d-4f25-9c45-3649456e4263-serviceca\") pod \"node-ca-5lt5l\" (UID: \"81c9353e-167d-4f25-9c45-3649456e4263\") " pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.839894 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81c9353e-167d-4f25-9c45-3649456e4263-host\") pod \"node-ca-5lt5l\" (UID: \"81c9353e-167d-4f25-9c45-3649456e4263\") " pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.839983 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm6f5\" (UniqueName: \"kubernetes.io/projected/81c9353e-167d-4f25-9c45-3649456e4263-kube-api-access-hm6f5\") pod \"node-ca-5lt5l\" (UID: \"81c9353e-167d-4f25-9c45-3649456e4263\") " pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.850410 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.860796 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.871580 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.879859 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.889907 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.898847 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.909892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.909913 4713 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.909921 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.909942 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.909951 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:46Z","lastTransitionTime":"2026-03-14T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.910404 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.925742 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.939273 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.940763 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/81c9353e-167d-4f25-9c45-3649456e4263-serviceca\") pod \"node-ca-5lt5l\" (UID: \"81c9353e-167d-4f25-9c45-3649456e4263\") " pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.940795 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81c9353e-167d-4f25-9c45-3649456e4263-host\") pod \"node-ca-5lt5l\" (UID: \"81c9353e-167d-4f25-9c45-3649456e4263\") " pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.940818 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hm6f5\" (UniqueName: \"kubernetes.io/projected/81c9353e-167d-4f25-9c45-3649456e4263-kube-api-access-hm6f5\") pod \"node-ca-5lt5l\" (UID: \"81c9353e-167d-4f25-9c45-3649456e4263\") " pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.941176 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81c9353e-167d-4f25-9c45-3649456e4263-host\") pod \"node-ca-5lt5l\" (UID: \"81c9353e-167d-4f25-9c45-3649456e4263\") " pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.941752 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/81c9353e-167d-4f25-9c45-3649456e4263-serviceca\") pod \"node-ca-5lt5l\" (UID: \"81c9353e-167d-4f25-9c45-3649456e4263\") " pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.949148 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.958512 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm6f5\" (UniqueName: 
\"kubernetes.io/projected/81c9353e-167d-4f25-9c45-3649456e4263-kube-api-access-hm6f5\") pod \"node-ca-5lt5l\" (UID: \"81c9353e-167d-4f25-9c45-3649456e4263\") " pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:46 crc kubenswrapper[4713]: I0314 05:28:46.959358 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.012937 4713 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.012969 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.012977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.012990 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.012999 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.100739 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5lt5l" Mar 14 05:28:47 crc kubenswrapper[4713]: W0314 05:28:47.112007 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c9353e_167d_4f25_9c45_3649456e4263.slice/crio-e32f0a2cbe72435be94720df5cc87950336b247d8208f85c89153fc1ed318f14 WatchSource:0}: Error finding container e32f0a2cbe72435be94720df5cc87950336b247d8208f85c89153fc1ed318f14: Status 404 returned error can't find the container with id e32f0a2cbe72435be94720df5cc87950336b247d8208f85c89153fc1ed318f14 Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.114849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.114881 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.114889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.114902 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.114910 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.231572 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.231609 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.232069 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.232093 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.232109 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.334926 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.334964 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.334975 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.334991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.335003 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.437888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.437932 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.437944 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.437961 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.437973 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.539895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.539927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.539935 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.539948 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.539956 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.586666 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.601518 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.624892 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.637045 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5lt5l" 
event={"ID":"81c9353e-167d-4f25-9c45-3649456e4263","Type":"ContainerStarted","Data":"f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.637101 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5lt5l" event={"ID":"81c9353e-167d-4f25-9c45-3649456e4263","Type":"ContainerStarted","Data":"e32f0a2cbe72435be94720df5cc87950336b247d8208f85c89153fc1ed318f14"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.642053 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.642089 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.642100 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.642117 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.642130 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.643014 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.643958 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" 
event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.644218 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.644245 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.649934 4713 generic.go:334] "Generic (PLEG): container finished" podID="8308bcb2-29df-4a11-86f3-b031e612b314" containerID="394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba" exitCode=0 Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.649968 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" event={"ID":"8308bcb2-29df-4a11-86f3-b031e612b314","Type":"ContainerDied","Data":"394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.661689 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.680182 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.683768 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.693445 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.704905 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.714495 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.723936 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.737931 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.745777 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.745835 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.745847 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.745869 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.745882 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.752605 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.768064 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.786829 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.799260 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.812564 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.822304 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.831672 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.842971 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.848269 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.848302 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.848312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.848329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.848341 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.851409 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.865569 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.877741 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.890676 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.903440 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.915041 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.927495 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.952462 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.952503 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.952514 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.952530 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:47 crc kubenswrapper[4713]: I0314 05:28:47.952539 4713 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:47Z","lastTransitionTime":"2026-03-14T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.055316 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.055364 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.055373 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.055391 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.055445 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.158124 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.158171 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.158183 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.158218 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.158232 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.261016 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.261065 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.261076 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.261090 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.261102 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.363574 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.363663 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.363688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.363716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.363733 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.466678 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.466745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.466764 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.466788 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.466807 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.563183 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.563270 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:48 crc kubenswrapper[4713]: E0314 05:28:48.563355 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.563183 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:48 crc kubenswrapper[4713]: E0314 05:28:48.563435 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:48 crc kubenswrapper[4713]: E0314 05:28:48.563577 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.568874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.568914 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.568926 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.568942 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.568952 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.658447 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" event={"ID":"8308bcb2-29df-4a11-86f3-b031e612b314","Type":"ContainerStarted","Data":"e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e"} Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.659410 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.672488 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.672540 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.672559 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.672584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.672603 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.677581 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.694055 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.701114 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.716784 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.731246 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.744462 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.761642 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.775348 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.775384 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.775396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.775412 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.775424 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.775395 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.790127 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.803302 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.824633 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.839694 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.852228 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.871082 4713 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.877779 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.877824 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.877833 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.877849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.877859 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.881592 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.893186 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.908004 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.921468 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.929973 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.943039 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.957128 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.970600 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.980660 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.980719 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.980742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.980773 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.980795 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:48Z","lastTransitionTime":"2026-03-14T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:48 crc kubenswrapper[4713]: I0314 05:28:48.982769 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:48Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.005818 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:49Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.018077 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:49Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.028601 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:49Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.040404 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:49Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.084255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.084288 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.084297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.084310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.084319 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:49Z","lastTransitionTime":"2026-03-14T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.187140 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.187173 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.187181 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.187194 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.187214 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:49Z","lastTransitionTime":"2026-03-14T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.290356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.290390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.290398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.290414 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.290421 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:49Z","lastTransitionTime":"2026-03-14T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.392605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.392674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.392684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.392702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.392713 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:49Z","lastTransitionTime":"2026-03-14T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.494530 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.494571 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.494579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.494593 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.494602 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:49Z","lastTransitionTime":"2026-03-14T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.597123 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.597171 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.597183 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.597224 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.597240 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:49Z","lastTransitionTime":"2026-03-14T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.699674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.699704 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.699713 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.699727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.699735 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:49Z","lastTransitionTime":"2026-03-14T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.802395 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.802427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.802436 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.802449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.802458 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:49Z","lastTransitionTime":"2026-03-14T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.904224 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.904274 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.904288 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.904305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:49 crc kubenswrapper[4713]: I0314 05:28:49.904315 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:49Z","lastTransitionTime":"2026-03-14T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.006791 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.006842 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.006854 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.006872 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.006885 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.109149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.109238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.109270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.109292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.109304 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.211616 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.211684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.211707 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.211734 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.211756 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.313898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.313954 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.313972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.313995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.314012 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.417494 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.417552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.417571 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.417595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.417614 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.520347 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.520391 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.520402 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.520418 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.520435 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.563025 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.563034 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.563034 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:50 crc kubenswrapper[4713]: E0314 05:28:50.563376 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:50 crc kubenswrapper[4713]: E0314 05:28:50.563440 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:50 crc kubenswrapper[4713]: E0314 05:28:50.563199 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.623648 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.623677 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.623688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.623703 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.623716 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.666708 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/0.log" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.670380 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762" exitCode=1 Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.670419 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.671351 4713 scope.go:117] "RemoveContainer" containerID="399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.694703 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.715481 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.725571 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.725594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.725602 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.725614 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.725623 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.728667 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.741894 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.754265 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.765007 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.778282 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.790370 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.802170 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.828030 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.828075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.828086 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.828105 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.828116 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.832776 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:49Z\\\",\\\"message\\\":\\\"-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 05:28:49.869106 6520 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0314 05:28:49.869124 6520 handler.go:190] Sending *v1.Pod event handler 6 
for removal\\\\nI0314 05:28:49.869134 6520 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 05:28:49.869157 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 05:28:49.869160 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 05:28:49.869185 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0314 05:28:49.869224 6520 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 05:28:49.869237 6520 factory.go:656] Stopping watch factory\\\\nI0314 05:28:49.869247 6520 ovnkube.go:599] Stopped ovnkube\\\\nI0314 05:28:49.869264 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 05:28:49.869263 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0314 05:28:49.869270 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 05:28:49.869274 6520 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0314 05:28:49.869277 6520 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0314 05:28:49.869280 6520 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0314 
05:28:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1
a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.850433 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.865120 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.878584 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:50Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.931103 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.931155 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.931165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.931230 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:50 crc kubenswrapper[4713]: I0314 05:28:50.931243 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:50Z","lastTransitionTime":"2026-03-14T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.035915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.035968 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.035977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.035991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.036018 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.138109 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.138152 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.138167 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.138184 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.138194 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.239667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.239693 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.239701 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.239713 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.239731 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.341613 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.341659 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.341670 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.341686 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.341698 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.444671 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.444700 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.444709 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.444722 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.444730 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.546882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.546932 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.546940 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.546954 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.546964 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.649848 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.649914 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.649932 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.649958 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.649975 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.675710 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/1.log" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.676702 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/0.log" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.679739 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991" exitCode=1 Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.679800 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.679865 4713 scope.go:117] "RemoveContainer" containerID="399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.680904 4713 scope.go:117] "RemoveContainer" containerID="4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991" Mar 14 05:28:51 crc kubenswrapper[4713]: E0314 05:28:51.681162 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.699542 4713 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.721463 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.740290 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.752705 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.752762 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.752777 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.752797 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.752813 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.760920 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:49Z\\\",\\\"message\\\":\\\"-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 05:28:49.869106 6520 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0314 05:28:49.869124 6520 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0314 05:28:49.869134 6520 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 
05:28:49.869157 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 05:28:49.869160 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 05:28:49.869185 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0314 05:28:49.869224 6520 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 05:28:49.869237 6520 factory.go:656] Stopping watch factory\\\\nI0314 05:28:49.869247 6520 ovnkube.go:599] Stopped ovnkube\\\\nI0314 05:28:49.869264 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 05:28:49.869263 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0314 05:28:49.869270 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 05:28:49.869274 6520 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0314 05:28:49.869277 6520 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0314 05:28:49.869280 6520 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0314 05:28:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:51Z\\\",\\\"message\\\":\\\"k controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z]\\\\nI0314 05:28:51.493270 6684 services_controller.go:434] Service openshift-ingress-canary/ingress-canary retrieved from lister for network=default: 
\\\\u0026Service{ObjectMeta:{ingress-canary openshift-ingress-canary 15635066-0a58-424e-b02f-8a2d0ad3c482 10679 0 2025-02-23 05:35:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingress.openshift.io/canary:canary_controller] map[service.beta.openshift.io/serving-cert-secret-name:canary-serving-cert] [{apps/v1 daemonset ingress-canary f5a2759b-dc3c-483d-93f0-055bac962b12 0xc0071f8da7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:8443-tcp,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:8888-tcp,Protocol:TCP,Port:8888,TargetPort:{0 8888 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{ingresscanary.operator.openshift.io/daemonset-ingresscanary: canary_controller,},ClusterIP:10.217.5.34,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.775222 4713 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.793457 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.806746 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.819720 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.832697 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.843651 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.852689 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.855163 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.855188 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.855197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.855227 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.855236 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.861509 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.871849 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.957626 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.957664 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.957673 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 
05:28:51.957686 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:51 crc kubenswrapper[4713]: I0314 05:28:51.957694 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:51Z","lastTransitionTime":"2026-03-14T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.060091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.060137 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.060149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.060166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.060177 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.163461 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.163532 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.163554 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.163582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.163600 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.266600 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.266911 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.266978 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.267042 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.267099 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.370718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.370950 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.371028 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.371101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.371164 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.474579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.474657 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.474697 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.474737 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.474767 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.572719 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.572866 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.572720 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:52 crc kubenswrapper[4713]: E0314 05:28:52.572991 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:52 crc kubenswrapper[4713]: E0314 05:28:52.573153 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:52 crc kubenswrapper[4713]: E0314 05:28:52.573360 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.579166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.579268 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.579289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.579321 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.579342 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.680569 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2"] Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.681686 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.684453 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.684534 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.684562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.684605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.684630 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.685715 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.688103 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.689620 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/1.log" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.697335 4713 scope.go:117] "RemoveContainer" containerID="4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991" Mar 14 05:28:52 crc kubenswrapper[4713]: E0314 05:28:52.697785 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.711355 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.734516 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.756227 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.774405 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.787292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.787534 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.787617 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.787716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.787795 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.793728 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399020a03797e2d204bf408f75c13115a07396e9c87d9e050cc15a6ac3f5b762\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:49Z\\\",\\\"message\\\":\\\"-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 05:28:49.869106 6520 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0314 05:28:49.869124 6520 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0314 05:28:49.869134 6520 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 
05:28:49.869157 6520 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 05:28:49.869160 6520 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 05:28:49.869185 6520 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0314 05:28:49.869224 6520 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 05:28:49.869237 6520 factory.go:656] Stopping watch factory\\\\nI0314 05:28:49.869247 6520 ovnkube.go:599] Stopped ovnkube\\\\nI0314 05:28:49.869264 6520 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 05:28:49.869263 6520 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0314 05:28:49.869270 6520 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 05:28:49.869274 6520 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0314 05:28:49.869277 6520 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0314 05:28:49.869280 6520 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0314 05:28:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:51Z\\\",\\\"message\\\":\\\"k controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z]\\\\nI0314 05:28:51.493270 6684 services_controller.go:434] Service openshift-ingress-canary/ingress-canary retrieved from lister for network=default: 
\\\\u0026Service{ObjectMeta:{ingress-canary openshift-ingress-canary 15635066-0a58-424e-b02f-8a2d0ad3c482 10679 0 2025-02-23 05:35:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingress.openshift.io/canary:canary_controller] map[service.beta.openshift.io/serving-cert-secret-name:canary-serving-cert] [{apps/v1 daemonset ingress-canary f5a2759b-dc3c-483d-93f0-055bac962b12 0xc0071f8da7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:8443-tcp,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:8888-tcp,Protocol:TCP,Port:8888,TargetPort:{0 8888 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{ingresscanary.operator.openshift.io/daemonset-ingresscanary: canary_controller,},ClusterIP:10.217.5.34,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.803008 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.803076 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct8dd\" (UniqueName: \"kubernetes.io/projected/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-kube-api-access-ct8dd\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.803097 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.803127 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.812873 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.826088 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.836821 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.846577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.858746 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.870311 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.883169 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.890195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.890525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.890619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.890708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.890794 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.896575 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.903877 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.903942 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.903972 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct8dd\" (UniqueName: \"kubernetes.io/projected/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-kube-api-access-ct8dd\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.903993 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.904541 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.904714 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.908114 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.909796 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.921029 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct8dd\" (UniqueName: 
\"kubernetes.io/projected/29ffc9b1-a736-461e-a7e2-14dca8f3a9e0-kube-api-access-ct8dd\") pod \"ovnkube-control-plane-749d76644c-bnsh2\" (UID: \"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.921913 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.936308 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.947922 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.958524 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.970427 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.981156 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.993915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.993971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.993981 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.993997 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:52 crc kubenswrapper[4713]: I0314 05:28:52.994007 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:52Z","lastTransitionTime":"2026-03-14T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.000607 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:52Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.010808 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.020424 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: W0314 05:28:53.023736 4713 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29ffc9b1_a736_461e_a7e2_14dca8f3a9e0.slice/crio-9e6874b305253e25ada4c0c9e7f70c56669e462a04954a1eb2b0b8ecccc3523f WatchSource:0}: Error finding container 9e6874b305253e25ada4c0c9e7f70c56669e462a04954a1eb2b0b8ecccc3523f: Status 404 returned error can't find the container with id 9e6874b305253e25ada4c0c9e7f70c56669e462a04954a1eb2b0b8ecccc3523f Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.034440 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
3-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.043866 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.068189 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:51Z\\\",\\\"message\\\":\\\"k controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z]\\\\nI0314 05:28:51.493270 6684 services_controller.go:434] Service openshift-ingress-canary/ingress-canary retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{ingress-canary openshift-ingress-canary 15635066-0a58-424e-b02f-8a2d0ad3c482 10679 0 2025-02-23 05:35:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingress.openshift.io/canary:canary_controller] map[service.beta.openshift.io/serving-cert-secret-name:canary-serving-cert] [{apps/v1 daemonset ingress-canary f5a2759b-dc3c-483d-93f0-055bac962b12 0xc0071f8da7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:8443-tcp,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:8888-tcp,Protocol:TCP,Port:8888,TargetPort:{0 8888 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{ingresscanary.operator.openshift.io/daemonset-ingresscanary: canary_controller,},ClusterIP:10.217.5.34,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.088382 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, 
/tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.104696 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.104733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.104744 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 
05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.104763 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.104776 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:53Z","lastTransitionTime":"2026-03-14T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.114473 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f280
8b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.126247 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.207189 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.207252 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.207266 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.207285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.207298 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:53Z","lastTransitionTime":"2026-03-14T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.310989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.311022 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.311034 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.311047 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.311056 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:53Z","lastTransitionTime":"2026-03-14T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.412936 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.412982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.412992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.413008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.413019 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:53Z","lastTransitionTime":"2026-03-14T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.439863 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2t6mv"] Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.440295 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:53 crc kubenswrapper[4713]: E0314 05:28:53.440356 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.455463 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:51Z\\\",\\\"message\\\":\\\"k controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z]\\\\nI0314 05:28:51.493270 6684 services_controller.go:434] Service openshift-ingress-canary/ingress-canary retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{ingress-canary openshift-ingress-canary 15635066-0a58-424e-b02f-8a2d0ad3c482 10679 0 2025-02-23 05:35:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingress.openshift.io/canary:canary_controller] map[service.beta.openshift.io/serving-cert-secret-name:canary-serving-cert] [{apps/v1 daemonset ingress-canary f5a2759b-dc3c-483d-93f0-055bac962b12 0xc0071f8da7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:8443-tcp,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:8888-tcp,Protocol:TCP,Port:8888,TargetPort:{0 8888 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{ingresscanary.operator.openshift.io/daemonset-ingresscanary: canary_controller,},ClusterIP:10.217.5.34,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.470633 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, 
/tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.488226 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.498925 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.508736 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.509008 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdwlz\" (UniqueName: \"kubernetes.io/projected/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-kube-api-access-mdwlz\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.509084 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.515655 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.515695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.515707 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.515724 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:53 
crc kubenswrapper[4713]: I0314 05:28:53.515737 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:53Z","lastTransitionTime":"2026-03-14T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.519425 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1
b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.531502 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.543558 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.554325 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.564169 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.571392 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.573885 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc 
kubenswrapper[4713]: I0314 05:28:53.587696 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.597752 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.610132 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdwlz\" (UniqueName: \"kubernetes.io/projected/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-kube-api-access-mdwlz\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " 
pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.610183 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:53 crc kubenswrapper[4713]: E0314 05:28:53.610304 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:28:53 crc kubenswrapper[4713]: E0314 05:28:53.610362 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs podName:8f2a1689-2973-4684-88d0-4ac7edb9b1d3 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:54.11034252 +0000 UTC m=+117.198251810 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs") pod "network-metrics-daemon-2t6mv" (UID: "8f2a1689-2973-4684-88d0-4ac7edb9b1d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.613397 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714
c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net
.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.617700 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.617736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.617746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.617762 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.617772 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:53Z","lastTransitionTime":"2026-03-14T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.624230 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.626990 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdwlz\" (UniqueName: \"kubernetes.io/projected/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-kube-api-access-mdwlz\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.698333 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" event={"ID":"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0","Type":"ContainerStarted","Data":"dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.698393 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" event={"ID":"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0","Type":"ContainerStarted","Data":"634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.698403 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" event={"ID":"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0","Type":"ContainerStarted","Data":"9e6874b305253e25ada4c0c9e7f70c56669e462a04954a1eb2b0b8ecccc3523f"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.712915 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.720330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.720348 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.720356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.720367 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.720375 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:53Z","lastTransitionTime":"2026-03-14T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.724852 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.737951 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.747662 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.756753 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc 
kubenswrapper[4713]: I0314 05:28:53.766983 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.782705 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:51Z\\\",\\\"message\\\":\\\"k controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z]\\\\nI0314 05:28:51.493270 6684 services_controller.go:434] Service openshift-ingress-canary/ingress-canary retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{ingress-canary openshift-ingress-canary 15635066-0a58-424e-b02f-8a2d0ad3c482 10679 0 2025-02-23 05:35:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingress.openshift.io/canary:canary_controller] map[service.beta.openshift.io/serving-cert-secret-name:canary-serving-cert] [{apps/v1 daemonset ingress-canary f5a2759b-dc3c-483d-93f0-055bac962b12 0xc0071f8da7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:8443-tcp,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:8888-tcp,Protocol:TCP,Port:8888,TargetPort:{0 8888 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{ingresscanary.operator.openshift.io/daemonset-ingresscanary: canary_controller,},ClusterIP:10.217.5.34,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.796918 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.807933 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.821563 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z"
Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.822579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.822627 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.822646 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.822667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.822683 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:53Z","lastTransitionTime":"2026-03-14T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.833472 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.845194 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.856025 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.866460 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.875794 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.885811 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.925573 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.925634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.925646 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.925683 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:53 crc kubenswrapper[4713]: I0314 05:28:53.925695 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:53Z","lastTransitionTime":"2026-03-14T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.028399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.028706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.028803 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.028940 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.029020 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.115602 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv"
Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.115897 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.116130 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs podName:8f2a1689-2973-4684-88d0-4ac7edb9b1d3 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:55.11609882 +0000 UTC m=+118.204008160 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs") pod "network-metrics-daemon-2t6mv" (UID: "8f2a1689-2973-4684-88d0-4ac7edb9b1d3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.131117 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.131172 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.131191 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.131245 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.131265 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.234174 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.234285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.234304 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.234332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.234352 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.270033 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.295591 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.329420 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.337154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.337219 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.337228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.337242 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.337254 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.343927 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.357893 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc 
kubenswrapper[4713]: I0314 05:28:54.370746 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.389101 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.408104 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:51Z\\\",\\\"message\\\":\\\"k controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z]\\\\nI0314 05:28:51.493270 6684 services_controller.go:434] Service openshift-ingress-canary/ingress-canary retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{ingress-canary openshift-ingress-canary 15635066-0a58-424e-b02f-8a2d0ad3c482 10679 0 2025-02-23 05:35:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingress.openshift.io/canary:canary_controller] map[service.beta.openshift.io/serving-cert-secret-name:canary-serving-cert] [{apps/v1 daemonset ingress-canary f5a2759b-dc3c-483d-93f0-055bac962b12 0xc0071f8da7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:8443-tcp,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:8888-tcp,Protocol:TCP,Port:8888,TargetPort:{0 8888 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{ingresscanary.operator.openshift.io/daemonset-ingresscanary: canary_controller,},ClusterIP:10.217.5.34,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.418928 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.419069 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.419117 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.419157 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.419266 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419436 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419487 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419523 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-14 05:29:10.419501145 +0000 UTC m=+133.507410475 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419680 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419710 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:29:10.4196452 +0000 UTC m=+133.507554530 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419723 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419751 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419754 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:10.419732693 +0000 UTC m=+133.507642153 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419817 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:10.419796815 +0000 UTC m=+133.507706145 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419686 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419875 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.419904 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:54 crc 
kubenswrapper[4713]: E0314 05:28:54.419981 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:10.41995797 +0000 UTC m=+133.507867340 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.429768 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.439709 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.439752 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.439763 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.439776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.439786 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.450544 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z 
is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.465516 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.480257 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.498988 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.516359 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.534188 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.542789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.542835 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.542852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.542874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.542892 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.551007 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.562842 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.562862 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.562971 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.562847 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.563093 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:54 crc kubenswrapper[4713]: E0314 05:28:54.563235 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.569351 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.645581 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.646270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.646354 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 
05:28:54.646446 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.646520 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.749419 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.749481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.749500 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.749523 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.749541 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.853052 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.853369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.853387 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.853408 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.853421 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.955765 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.955812 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.955838 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.955855 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:54 crc kubenswrapper[4713]: I0314 05:28:54.955867 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:54Z","lastTransitionTime":"2026-03-14T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.013591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.013674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.013699 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.013733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.013755 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: E0314 05:28:55.029040 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:55Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.033561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.033599 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.033612 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.033629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.033644 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: E0314 05:28:55.047433 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:55Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.051390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.051430 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.051442 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.051459 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.051472 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: E0314 05:28:55.070815 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:55Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.075240 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.075273 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.075284 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.075299 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.075310 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: E0314 05:28:55.088281 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:55Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.091965 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.092002 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.092013 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.092028 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.092039 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: E0314 05:28:55.103939 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:55Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:55 crc kubenswrapper[4713]: E0314 05:28:55.104074 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.106592 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.106618 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.106625 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.106639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.106648 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.126268 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:55 crc kubenswrapper[4713]: E0314 05:28:55.126430 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:28:55 crc kubenswrapper[4713]: E0314 05:28:55.126521 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs podName:8f2a1689-2973-4684-88d0-4ac7edb9b1d3 nodeName:}" failed. No retries permitted until 2026-03-14 05:28:57.126497252 +0000 UTC m=+120.214406572 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs") pod "network-metrics-daemon-2t6mv" (UID: "8f2a1689-2973-4684-88d0-4ac7edb9b1d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.208661 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.208696 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.208707 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.208722 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.208733 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.311288 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.311325 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.311337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.311355 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.311367 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.414764 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.414824 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.414849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.414882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.414897 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.518777 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.518836 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.518849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.518868 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.518881 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.562707 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:55 crc kubenswrapper[4713]: E0314 05:28:55.562853 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.621807 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.622077 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.622184 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.622285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.622350 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.725284 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.725315 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.725323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.725337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.725347 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.827113 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.827161 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.827176 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.827197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.827234 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.929137 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.929426 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.929525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.929609 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:55 crc kubenswrapper[4713]: I0314 05:28:55.929673 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:55Z","lastTransitionTime":"2026-03-14T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.031521 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.031561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.031573 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.031589 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.031602 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.133629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.133892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.134195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.134298 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.134378 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.238705 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.238993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.239081 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.239189 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.239310 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.343190 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.343256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.343269 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.343288 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.343300 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.446001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.446059 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.446071 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.446088 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.446101 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.548402 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.548453 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.548467 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.548487 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.548500 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.563362 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.563373 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.563414 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:56 crc kubenswrapper[4713]: E0314 05:28:56.563822 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:56 crc kubenswrapper[4713]: E0314 05:28:56.563644 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:56 crc kubenswrapper[4713]: E0314 05:28:56.564002 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.650652 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.650716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.650732 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.650752 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.650766 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.754350 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.754410 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.754422 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.754443 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.754459 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.857286 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.857377 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.857400 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.857435 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.857460 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.960622 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.960946 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.961056 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.961155 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:56 crc kubenswrapper[4713]: I0314 05:28:56.961279 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:56Z","lastTransitionTime":"2026-03-14T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.063915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.063989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.064012 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.064045 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.064063 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:57Z","lastTransitionTime":"2026-03-14T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.151813 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:57 crc kubenswrapper[4713]: E0314 05:28:57.151957 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:28:57 crc kubenswrapper[4713]: E0314 05:28:57.152032 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs podName:8f2a1689-2973-4684-88d0-4ac7edb9b1d3 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:01.152007171 +0000 UTC m=+124.239916491 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs") pod "network-metrics-daemon-2t6mv" (UID: "8f2a1689-2973-4684-88d0-4ac7edb9b1d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.166261 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.166289 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.166298 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.166311 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.166320 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:57Z","lastTransitionTime":"2026-03-14T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.269636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.269696 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.269718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.269742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.269760 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:57Z","lastTransitionTime":"2026-03-14T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.372414 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.372488 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.372510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.372539 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.372558 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:57Z","lastTransitionTime":"2026-03-14T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.475914 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.475981 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.476001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.476032 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.476054 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:28:57Z","lastTransitionTime":"2026-03-14T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.563052 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:57 crc kubenswrapper[4713]: E0314 05:28:57.563436 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:28:57 crc kubenswrapper[4713]: E0314 05:28:57.576324 4713 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.587483 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.610157 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.626773 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.642899 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.655768 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: E0314 05:28:57.664989 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.672381 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.692381 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.707590 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.723391 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.738889 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc 
kubenswrapper[4713]: I0314 05:28:57.754930 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.774824 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.794950 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.818258 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.836898 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:57 crc kubenswrapper[4713]: I0314 05:28:57.869840 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:51Z\\\",\\\"message\\\":\\\"k controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z]\\\\nI0314 05:28:51.493270 6684 services_controller.go:434] Service openshift-ingress-canary/ingress-canary retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{ingress-canary openshift-ingress-canary 15635066-0a58-424e-b02f-8a2d0ad3c482 10679 0 2025-02-23 05:35:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingress.openshift.io/canary:canary_controller] map[service.beta.openshift.io/serving-cert-secret-name:canary-serving-cert] [{apps/v1 daemonset ingress-canary f5a2759b-dc3c-483d-93f0-055bac962b12 0xc0071f8da7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:8443-tcp,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:8888-tcp,Protocol:TCP,Port:8888,TargetPort:{0 8888 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{ingresscanary.operator.openshift.io/daemonset-ingresscanary: canary_controller,},ClusterIP:10.217.5.34,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:28:58 crc kubenswrapper[4713]: I0314 05:28:58.563721 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:28:58 crc kubenswrapper[4713]: I0314 05:28:58.563801 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:28:58 crc kubenswrapper[4713]: E0314 05:28:58.563929 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:28:58 crc kubenswrapper[4713]: E0314 05:28:58.564019 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:28:58 crc kubenswrapper[4713]: I0314 05:28:58.564105 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:28:58 crc kubenswrapper[4713]: E0314 05:28:58.564406 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:28:59 crc kubenswrapper[4713]: I0314 05:28:59.563598 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:28:59 crc kubenswrapper[4713]: E0314 05:28:59.563775 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:00 crc kubenswrapper[4713]: I0314 05:29:00.562816 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:00 crc kubenswrapper[4713]: E0314 05:29:00.563615 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:00 crc kubenswrapper[4713]: I0314 05:29:00.562912 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:00 crc kubenswrapper[4713]: E0314 05:29:00.563742 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:00 crc kubenswrapper[4713]: I0314 05:29:00.562835 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:00 crc kubenswrapper[4713]: E0314 05:29:00.563835 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:01 crc kubenswrapper[4713]: I0314 05:29:01.200130 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:01 crc kubenswrapper[4713]: E0314 05:29:01.200380 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:29:01 crc kubenswrapper[4713]: E0314 05:29:01.200464 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs podName:8f2a1689-2973-4684-88d0-4ac7edb9b1d3 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:09.200438376 +0000 UTC m=+132.288347686 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs") pod "network-metrics-daemon-2t6mv" (UID: "8f2a1689-2973-4684-88d0-4ac7edb9b1d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:29:01 crc kubenswrapper[4713]: I0314 05:29:01.563170 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:01 crc kubenswrapper[4713]: E0314 05:29:01.563514 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:02 crc kubenswrapper[4713]: I0314 05:29:02.563126 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:02 crc kubenswrapper[4713]: I0314 05:29:02.563274 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:02 crc kubenswrapper[4713]: I0314 05:29:02.563262 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:02 crc kubenswrapper[4713]: E0314 05:29:02.563369 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:02 crc kubenswrapper[4713]: E0314 05:29:02.563444 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:02 crc kubenswrapper[4713]: E0314 05:29:02.563605 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:02 crc kubenswrapper[4713]: E0314 05:29:02.666507 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.562689 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:03 crc kubenswrapper[4713]: E0314 05:29:03.562920 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.564247 4713 scope.go:117] "RemoveContainer" containerID="4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.736110 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/1.log" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.738604 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e"} Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.739095 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.759648 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.772367 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac
63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.789502 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.809303 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.825313 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.843300 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.855181 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.864934 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.874647 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.887301 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.900387 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.919531 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.936155 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.952255 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc 
kubenswrapper[4713]: I0314 05:29:03.972330 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:03 crc kubenswrapper[4713]: I0314 05:29:03.995864 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:51Z\\\",\\\"message\\\":\\\"k controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z]\\\\nI0314 05:28:51.493270 6684 services_controller.go:434] Service openshift-ingress-canary/ingress-canary retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{ingress-canary openshift-ingress-canary 15635066-0a58-424e-b02f-8a2d0ad3c482 10679 0 2025-02-23 05:35:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingress.openshift.io/canary:canary_controller] map[service.beta.openshift.io/serving-cert-secret-name:canary-serving-cert] [{apps/v1 daemonset ingress-canary f5a2759b-dc3c-483d-93f0-055bac962b12 0xc0071f8da7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:8443-tcp,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:8888-tcp,Protocol:TCP,Port:8888,TargetPort:{0 8888 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{ingresscanary.operator.openshift.io/daemonset-ingresscanary: 
canary_controller,},ClusterIP:10.217.5.34,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{
\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.563675 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.563675 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.563832 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:04 crc kubenswrapper[4713]: E0314 05:29:04.563976 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:04 crc kubenswrapper[4713]: E0314 05:29:04.564132 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:04 crc kubenswrapper[4713]: E0314 05:29:04.564294 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.760876 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/2.log" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.761559 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/1.log" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.764527 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e" exitCode=1 Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.764597 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e"} Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.764642 4713 scope.go:117] "RemoveContainer" containerID="4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.765419 4713 scope.go:117] "RemoveContainer" containerID="94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e" Mar 14 05:29:04 crc kubenswrapper[4713]: E0314 05:29:04.765646 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" Mar 14 05:29:04 crc 
kubenswrapper[4713]: I0314 05:29:04.789984 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d9eda5e78cf2d862caebd4af17083c3cb4e9acc6053561531c652f94c98d991\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:28:51Z\\\",\\\"message\\\":\\\"k controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T05:28:51Z is after 2025-08-24T17:21:41Z]\\\\nI0314 05:28:51.493270 6684 services_controller.go:434] Service openshift-ingress-canary/ingress-canary retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{ingress-canary openshift-ingress-canary 15635066-0a58-424e-b02f-8a2d0ad3c482 10679 0 2025-02-23 05:35:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingress.openshift.io/canary:canary_controller] map[service.beta.openshift.io/serving-cert-secret-name:canary-serving-cert] [{apps/v1 daemonset ingress-canary f5a2759b-dc3c-483d-93f0-055bac962b12 0xc0071f8da7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:8443-tcp,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:8888-tcp,Protocol:TCP,Port:8888,TargetPort:{0 8888 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{ingresscanary.operator.openshift.io/daemonset-ingresscanary: canary_controller,},ClusterIP:10.217.5.34,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:04Z\\\",\\\"message\\\":\\\"fter 0 failed attempt(s)\\\\nI0314 05:29:04.419535 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0314 05:29:04.419579 6937 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4ds64\\\\nI0314 05:29:04.419576 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-h4rjf\\\\nI0314 05:29:04.419536 6937 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-image-registry/node-ca-5lt5l\\\\nI0314 05:29:04.419584 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0314 05:29:04.419590 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-sx769\\\\nI0314 05:29:04.419595 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2\\\\nI0314 05:29:04.419604 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0314 05:29:04.419622 6937 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, h\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d
\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.801383 4713 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.816055 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.831030 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.841275 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.850643 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.861002 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.869766 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.876683 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.885935 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.898945 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.910568 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.922910 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.932798 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc kubenswrapper[4713]: I0314 05:29:04.940138 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:04 crc 
kubenswrapper[4713]: I0314 05:29:04.955459 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.321399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.321446 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.321460 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.321477 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.321490 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:05Z","lastTransitionTime":"2026-03-14T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:29:05 crc kubenswrapper[4713]: E0314 05:29:05.332148 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.334940 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.334971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.334979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.334993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.335003 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:05Z","lastTransitionTime":"2026-03-14T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:05 crc kubenswrapper[4713]: E0314 05:29:05.345589 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.349747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.349784 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.349800 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.349814 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.349824 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:05Z","lastTransitionTime":"2026-03-14T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:05 crc kubenswrapper[4713]: E0314 05:29:05.361843 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.365043 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.365084 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.365093 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.365112 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.365123 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:05Z","lastTransitionTime":"2026-03-14T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:05 crc kubenswrapper[4713]: E0314 05:29:05.376770 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.380270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.380298 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.380308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.380322 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.380332 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:05Z","lastTransitionTime":"2026-03-14T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:05 crc kubenswrapper[4713]: E0314 05:29:05.390276 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: E0314 05:29:05.390376 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.563174 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:05 crc kubenswrapper[4713]: E0314 05:29:05.563364 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.768511 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/2.log" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.771563 4713 scope.go:117] "RemoveContainer" containerID="94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e" Mar 14 05:29:05 crc kubenswrapper[4713]: E0314 05:29:05.771719 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.781375 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.790677 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.798867 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.806852 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.815274 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.824956 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.834606 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.848349 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.858331 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.867030 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc 
kubenswrapper[4713]: I0314 05:29:05.877369 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.888359 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.904032 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:04Z\\\",\\\"message\\\":\\\"fter 0 failed attempt(s)\\\\nI0314 05:29:04.419535 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0314 05:29:04.419579 6937 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4ds64\\\\nI0314 05:29:04.419576 6937 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-h4rjf\\\\nI0314 05:29:04.419536 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5lt5l\\\\nI0314 05:29:04.419584 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0314 05:29:04.419590 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-sx769\\\\nI0314 05:29:04.419595 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2\\\\nI0314 05:29:04.419604 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0314 05:29:04.419622 6937 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, h\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.919718 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.935468 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:05 crc kubenswrapper[4713]: I0314 05:29:05.949051 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:05Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:06 crc kubenswrapper[4713]: I0314 05:29:06.562982 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:06 crc kubenswrapper[4713]: I0314 05:29:06.563018 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:06 crc kubenswrapper[4713]: I0314 05:29:06.563018 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:06 crc kubenswrapper[4713]: E0314 05:29:06.563169 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:06 crc kubenswrapper[4713]: E0314 05:29:06.563284 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:06 crc kubenswrapper[4713]: E0314 05:29:06.563445 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.562831 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:07 crc kubenswrapper[4713]: E0314 05:29:07.563028 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.595938 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:04Z\\\",\\\"message\\\":\\\"fter 0 failed attempt(s)\\\\nI0314 05:29:04.419535 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0314 05:29:04.419579 6937 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4ds64\\\\nI0314 05:29:04.419576 6937 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-h4rjf\\\\nI0314 05:29:04.419536 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5lt5l\\\\nI0314 05:29:04.419584 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0314 05:29:04.419590 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-sx769\\\\nI0314 05:29:04.419595 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2\\\\nI0314 05:29:04.419604 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0314 05:29:04.419622 6937 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, h\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.616133 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.630999 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.646612 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.658059 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: E0314 05:29:07.667004 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.674010 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.685368 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.704418 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.716016 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.730717 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.746756 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.759669 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.770986 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc 
kubenswrapper[4713]: I0314 05:29:07.784796 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.797491 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:07 crc kubenswrapper[4713]: I0314 05:29:07.818524 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:08 crc kubenswrapper[4713]: I0314 05:29:08.563516 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:08 crc kubenswrapper[4713]: I0314 05:29:08.563562 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:08 crc kubenswrapper[4713]: I0314 05:29:08.563598 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:08 crc kubenswrapper[4713]: E0314 05:29:08.564718 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:08 crc kubenswrapper[4713]: E0314 05:29:08.564841 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:08 crc kubenswrapper[4713]: E0314 05:29:08.564946 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:09 crc kubenswrapper[4713]: I0314 05:29:09.202331 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:09 crc kubenswrapper[4713]: E0314 05:29:09.202493 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:29:09 crc kubenswrapper[4713]: E0314 05:29:09.202595 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs podName:8f2a1689-2973-4684-88d0-4ac7edb9b1d3 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:25.20257379 +0000 UTC m=+148.290483090 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs") pod "network-metrics-daemon-2t6mv" (UID: "8f2a1689-2973-4684-88d0-4ac7edb9b1d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:29:09 crc kubenswrapper[4713]: I0314 05:29:09.563323 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:09 crc kubenswrapper[4713]: E0314 05:29:09.563543 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:10 crc kubenswrapper[4713]: I0314 05:29:10.517492 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:29:10 crc kubenswrapper[4713]: I0314 05:29:10.517720 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.517758 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:29:42.517727304 +0000 UTC m=+165.605636634 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:29:10 crc kubenswrapper[4713]: I0314 05:29:10.517801 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.517850 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.517936 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:42.517911029 +0000 UTC m=+165.605820369 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.517965 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.517988 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:29:10 crc kubenswrapper[4713]: I0314 05:29:10.517851 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.518007 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:29:10 crc kubenswrapper[4713]: I0314 05:29:10.518171 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.518095 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.518289 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:42.518250369 +0000 UTC m=+165.606159699 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.518323 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.518338 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.518426 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:42.518402633 +0000 UTC m=+165.606312013 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.518345 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.518538 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:42.518513836 +0000 UTC m=+165.606423176 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:29:10 crc kubenswrapper[4713]: I0314 05:29:10.562615 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:10 crc kubenswrapper[4713]: I0314 05:29:10.562673 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.562773 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:10 crc kubenswrapper[4713]: I0314 05:29:10.562629 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.562907 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:10 crc kubenswrapper[4713]: E0314 05:29:10.563005 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:11 crc kubenswrapper[4713]: I0314 05:29:11.563653 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:11 crc kubenswrapper[4713]: E0314 05:29:11.563848 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:12 crc kubenswrapper[4713]: I0314 05:29:12.563235 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:12 crc kubenswrapper[4713]: E0314 05:29:12.563369 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:12 crc kubenswrapper[4713]: I0314 05:29:12.563434 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:12 crc kubenswrapper[4713]: I0314 05:29:12.563440 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:12 crc kubenswrapper[4713]: E0314 05:29:12.563563 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:12 crc kubenswrapper[4713]: E0314 05:29:12.563749 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:12 crc kubenswrapper[4713]: E0314 05:29:12.668420 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:29:13 crc kubenswrapper[4713]: I0314 05:29:13.563075 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:13 crc kubenswrapper[4713]: E0314 05:29:13.563354 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:14 crc kubenswrapper[4713]: I0314 05:29:14.563656 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:14 crc kubenswrapper[4713]: I0314 05:29:14.563708 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:14 crc kubenswrapper[4713]: I0314 05:29:14.563715 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:14 crc kubenswrapper[4713]: E0314 05:29:14.563927 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:14 crc kubenswrapper[4713]: E0314 05:29:14.563992 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:14 crc kubenswrapper[4713]: E0314 05:29:14.564053 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.562648 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:15 crc kubenswrapper[4713]: E0314 05:29:15.562851 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.745821 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.745876 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.745896 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.745923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.745942 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:15Z","lastTransitionTime":"2026-03-14T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:15 crc kubenswrapper[4713]: E0314 05:29:15.766443 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:15Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.770309 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.770375 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.770392 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.770417 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.770434 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:15Z","lastTransitionTime":"2026-03-14T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:15 crc kubenswrapper[4713]: E0314 05:29:15.788049 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:15Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.792782 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.792819 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.792832 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.792851 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.792862 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:15Z","lastTransitionTime":"2026-03-14T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:15 crc kubenswrapper[4713]: E0314 05:29:15.806634 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:15Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.810171 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.810197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.810245 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.810260 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.810272 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:15Z","lastTransitionTime":"2026-03-14T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:15 crc kubenswrapper[4713]: E0314 05:29:15.823512 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:15Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.827298 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.827326 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.827339 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.827354 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:15 crc kubenswrapper[4713]: I0314 05:29:15.827366 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:15Z","lastTransitionTime":"2026-03-14T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:15 crc kubenswrapper[4713]: E0314 05:29:15.840889 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:15Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:15 crc kubenswrapper[4713]: E0314 05:29:15.841097 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:29:16 crc kubenswrapper[4713]: I0314 05:29:16.563493 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:16 crc kubenswrapper[4713]: I0314 05:29:16.563600 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:16 crc kubenswrapper[4713]: E0314 05:29:16.563665 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:16 crc kubenswrapper[4713]: I0314 05:29:16.563718 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:16 crc kubenswrapper[4713]: E0314 05:29:16.563924 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:16 crc kubenswrapper[4713]: E0314 05:29:16.564044 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.562875 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:17 crc kubenswrapper[4713]: E0314 05:29:17.563049 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.577100 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.591429 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.604884 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.618491 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.628476 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.637834 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.651373 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.668730 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: E0314 05:29:17.668932 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.686781 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab03513
4159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.701243 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b
3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.715413 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc 
kubenswrapper[4713]: I0314 05:29:17.732401 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.756216 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:04Z\\\",\\\"message\\\":\\\"fter 0 failed attempt(s)\\\\nI0314 05:29:04.419535 6937 obj_retry.go:365] Adding new 
object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0314 05:29:04.419579 6937 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4ds64\\\\nI0314 05:29:04.419576 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-h4rjf\\\\nI0314 05:29:04.419536 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5lt5l\\\\nI0314 05:29:04.419584 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0314 05:29:04.419590 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-sx769\\\\nI0314 05:29:04.419595 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2\\\\nI0314 05:29:04.419604 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0314 05:29:04.419622 6937 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, h\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.769994 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01
839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.784098 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:17 crc kubenswrapper[4713]: I0314 05:29:17.799892 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:17Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:18 crc kubenswrapper[4713]: I0314 05:29:18.563529 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:18 crc kubenswrapper[4713]: I0314 05:29:18.563529 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:18 crc kubenswrapper[4713]: E0314 05:29:18.563790 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:18 crc kubenswrapper[4713]: E0314 05:29:18.563891 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:18 crc kubenswrapper[4713]: I0314 05:29:18.563549 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:18 crc kubenswrapper[4713]: E0314 05:29:18.564110 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:19 crc kubenswrapper[4713]: I0314 05:29:19.562856 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:19 crc kubenswrapper[4713]: E0314 05:29:19.567840 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:19 crc kubenswrapper[4713]: I0314 05:29:19.569400 4713 scope.go:117] "RemoveContainer" containerID="94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e" Mar 14 05:29:19 crc kubenswrapper[4713]: E0314 05:29:19.569724 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" Mar 14 05:29:20 crc kubenswrapper[4713]: I0314 05:29:20.562815 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:20 crc kubenswrapper[4713]: I0314 05:29:20.562851 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:20 crc kubenswrapper[4713]: I0314 05:29:20.562955 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:20 crc kubenswrapper[4713]: E0314 05:29:20.563086 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:20 crc kubenswrapper[4713]: E0314 05:29:20.563263 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:20 crc kubenswrapper[4713]: E0314 05:29:20.563342 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:21 crc kubenswrapper[4713]: I0314 05:29:21.563036 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:21 crc kubenswrapper[4713]: E0314 05:29:21.563257 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:22 crc kubenswrapper[4713]: I0314 05:29:22.563672 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:22 crc kubenswrapper[4713]: I0314 05:29:22.563723 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:22 crc kubenswrapper[4713]: E0314 05:29:22.563917 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:22 crc kubenswrapper[4713]: I0314 05:29:22.563723 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:22 crc kubenswrapper[4713]: E0314 05:29:22.564148 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:22 crc kubenswrapper[4713]: E0314 05:29:22.564375 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:22 crc kubenswrapper[4713]: E0314 05:29:22.670915 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:29:23 crc kubenswrapper[4713]: I0314 05:29:23.563327 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:23 crc kubenswrapper[4713]: E0314 05:29:23.563722 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:24 crc kubenswrapper[4713]: I0314 05:29:24.563443 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:24 crc kubenswrapper[4713]: I0314 05:29:24.563519 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:24 crc kubenswrapper[4713]: I0314 05:29:24.563443 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:24 crc kubenswrapper[4713]: E0314 05:29:24.563602 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:24 crc kubenswrapper[4713]: E0314 05:29:24.563737 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:24 crc kubenswrapper[4713]: E0314 05:29:24.563851 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:25 crc kubenswrapper[4713]: I0314 05:29:25.283712 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:25 crc kubenswrapper[4713]: E0314 05:29:25.283855 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:29:25 crc kubenswrapper[4713]: E0314 05:29:25.283947 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs podName:8f2a1689-2973-4684-88d0-4ac7edb9b1d3 nodeName:}" failed. No retries permitted until 2026-03-14 05:29:57.283925221 +0000 UTC m=+180.371834521 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs") pod "network-metrics-daemon-2t6mv" (UID: "8f2a1689-2973-4684-88d0-4ac7edb9b1d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:29:25 crc kubenswrapper[4713]: I0314 05:29:25.563318 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:25 crc kubenswrapper[4713]: E0314 05:29:25.563470 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:25 crc kubenswrapper[4713]: I0314 05:29:25.989127 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:25 crc kubenswrapper[4713]: I0314 05:29:25.989169 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:25 crc kubenswrapper[4713]: I0314 05:29:25.989178 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:25 crc kubenswrapper[4713]: I0314 05:29:25.989195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:25 crc kubenswrapper[4713]: I0314 05:29:25.989229 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:25Z","lastTransitionTime":"2026-03-14T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:26 crc kubenswrapper[4713]: E0314 05:29:26.001356 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:25Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.006017 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.006085 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.006111 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.006140 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.006161 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:26Z","lastTransitionTime":"2026-03-14T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:26 crc kubenswrapper[4713]: E0314 05:29:26.025500 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:26Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.030566 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.030619 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.030637 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.030661 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.030680 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:26Z","lastTransitionTime":"2026-03-14T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:26 crc kubenswrapper[4713]: E0314 05:29:26.049142 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:26Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.053346 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.053381 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.053394 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.053415 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.053429 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:26Z","lastTransitionTime":"2026-03-14T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.076940 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.076976 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.076990 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.077009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.077020 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:26Z","lastTransitionTime":"2026-03-14T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:26Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:26 crc kubenswrapper[4713]: E0314 05:29:26.091224 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.563363 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.563436 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:26 crc kubenswrapper[4713]: E0314 05:29:26.563516 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:26 crc kubenswrapper[4713]: I0314 05:29:26.563531 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:26 crc kubenswrapper[4713]: E0314 05:29:26.563697 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:26 crc kubenswrapper[4713]: E0314 05:29:26.563767 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.562901 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:27 crc kubenswrapper[4713]: E0314 05:29:27.563141 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.585183 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.607802 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.626522 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.639555 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.651796 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.669704 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: E0314 05:29:27.671611 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.687938 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.707577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.722121 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.738243 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.754336 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.771655 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.789715 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.805270 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.817451 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc 
kubenswrapper[4713]: I0314 05:29:27.837440 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:04Z\\\",\\\"message\\\":\\\"fter 0 failed attempt(s)\\\\nI0314 05:29:04.419535 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0314 05:29:04.419579 6937 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4ds64\\\\nI0314 05:29:04.419576 6937 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-h4rjf\\\\nI0314 05:29:04.419536 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5lt5l\\\\nI0314 05:29:04.419584 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0314 05:29:04.419590 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-sx769\\\\nI0314 05:29:04.419595 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2\\\\nI0314 05:29:04.419604 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0314 05:29:04.419622 6937 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, h\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.844895 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5l5jq_703b6542-1a83-442a-9673-6a774399dd7e/kube-multus/0.log" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.844958 4713 generic.go:334] "Generic (PLEG): container finished" podID="703b6542-1a83-442a-9673-6a774399dd7e" containerID="ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78" exitCode=1 Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.845005 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5l5jq" event={"ID":"703b6542-1a83-442a-9673-6a774399dd7e","Type":"ContainerDied","Data":"ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78"} Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.845491 4713 scope.go:117] "RemoveContainer" containerID="ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78" Mar 14 05:29:27 
crc kubenswrapper[4713]: I0314 05:29:27.860774 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc 
kubenswrapper[4713]: I0314 05:29:27.877654 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.903279 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.922907 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.944653 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.960658 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:27 crc kubenswrapper[4713]: I0314 05:29:27.987884 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:04Z\\\",\\\"message\\\":\\\"fter 0 failed attempt(s)\\\\nI0314 05:29:04.419535 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0314 05:29:04.419579 6937 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4ds64\\\\nI0314 05:29:04.419576 6937 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-h4rjf\\\\nI0314 05:29:04.419536 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5lt5l\\\\nI0314 05:29:04.419584 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0314 05:29:04.419590 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-sx769\\\\nI0314 05:29:04.419595 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2\\\\nI0314 05:29:04.419604 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0314 05:29:04.419622 6937 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, h\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:27Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.008612 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.026532 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"2026-03-14T05:28:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956\\\\n2026-03-14T05:28:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956 to /host/opt/cni/bin/\\\\n2026-03-14T05:28:42Z [verbose] multus-daemon started\\\\n2026-03-14T05:28:42Z [verbose] Readiness Indicator file check\\\\n2026-03-14T05:29:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.042633 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.058750 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.072574 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.087572 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.110183 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.130789 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.145631 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z"
Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.563882 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.563896 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 05:29:28 crc kubenswrapper[4713]: E0314 05:29:28.564121 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 14 05:29:28 crc kubenswrapper[4713]: E0314 05:29:28.564246 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.564466 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 05:29:28 crc kubenswrapper[4713]: E0314 05:29:28.564625 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.851363 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5l5jq_703b6542-1a83-442a-9673-6a774399dd7e/kube-multus/0.log"
Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.851431 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5l5jq" event={"ID":"703b6542-1a83-442a-9673-6a774399dd7e","Type":"ContainerStarted","Data":"430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8"}
Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.874185 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.897649 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"2026-03-14T05:28:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956\\\\n2026-03-14T05:28:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956 to /host/opt/cni/bin/\\\\n2026-03-14T05:28:42Z [verbose] multus-daemon started\\\\n2026-03-14T05:28:42Z [verbose] Readiness Indicator file check\\\\n2026-03-14T05:29:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.915846 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.928914 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.940569 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.950822 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.963280 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.985566 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:28 crc kubenswrapper[4713]: I0314 05:29:28.998421 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:28Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:29 crc kubenswrapper[4713]: I0314 05:29:29.019136 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:29Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:29 crc kubenswrapper[4713]: I0314 05:29:29.031288 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:29Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:29 crc kubenswrapper[4713]: I0314 05:29:29.046020 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:29Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:29 crc 
kubenswrapper[4713]: I0314 05:29:29.061830 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:29Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:29 crc kubenswrapper[4713]: I0314 05:29:29.080430 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:29Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:29 crc kubenswrapper[4713]: I0314 05:29:29.093860 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:29Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:29 crc kubenswrapper[4713]: I0314 05:29:29.114784 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:04Z\\\",\\\"message\\\":\\\"fter 0 failed attempt(s)\\\\nI0314 05:29:04.419535 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0314 05:29:04.419579 6937 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4ds64\\\\nI0314 05:29:04.419576 6937 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-h4rjf\\\\nI0314 05:29:04.419536 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5lt5l\\\\nI0314 05:29:04.419584 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0314 05:29:04.419590 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-sx769\\\\nI0314 05:29:04.419595 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2\\\\nI0314 05:29:04.419604 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0314 05:29:04.419622 6937 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, h\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:29Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:29 crc kubenswrapper[4713]: I0314 05:29:29.562902 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:29 crc kubenswrapper[4713]: E0314 05:29:29.563104 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:30 crc kubenswrapper[4713]: I0314 05:29:30.563684 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:30 crc kubenswrapper[4713]: E0314 05:29:30.564605 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:30 crc kubenswrapper[4713]: I0314 05:29:30.563948 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:30 crc kubenswrapper[4713]: E0314 05:29:30.564800 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:30 crc kubenswrapper[4713]: I0314 05:29:30.563787 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:30 crc kubenswrapper[4713]: E0314 05:29:30.564916 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:31 crc kubenswrapper[4713]: I0314 05:29:31.563631 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:31 crc kubenswrapper[4713]: E0314 05:29:31.563900 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:32 crc kubenswrapper[4713]: I0314 05:29:32.563394 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:32 crc kubenswrapper[4713]: I0314 05:29:32.563452 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:32 crc kubenswrapper[4713]: I0314 05:29:32.563508 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:32 crc kubenswrapper[4713]: E0314 05:29:32.563596 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:32 crc kubenswrapper[4713]: E0314 05:29:32.563759 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:32 crc kubenswrapper[4713]: E0314 05:29:32.563889 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:32 crc kubenswrapper[4713]: E0314 05:29:32.673437 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:29:33 crc kubenswrapper[4713]: I0314 05:29:33.563319 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:33 crc kubenswrapper[4713]: E0314 05:29:33.563572 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.563258 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.563347 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.563447 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:34 crc kubenswrapper[4713]: E0314 05:29:34.563494 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:34 crc kubenswrapper[4713]: E0314 05:29:34.563811 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:34 crc kubenswrapper[4713]: E0314 05:29:34.563960 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.565109 4713 scope.go:117] "RemoveContainer" containerID="94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.876281 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/2.log" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.880152 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339"} Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.880724 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.897336 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:34Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.928016 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:34Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.956885 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:34Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.979135 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:34Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:34 crc kubenswrapper[4713]: I0314 05:29:34.994442 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:34Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc 
kubenswrapper[4713]: I0314 05:29:35.014512 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.037152 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:04Z\\\",\\\"message\\\":\\\"fter 0 failed attempt(s)\\\\nI0314 05:29:04.419535 6937 obj_retry.go:365] Adding new 
object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0314 05:29:04.419579 6937 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4ds64\\\\nI0314 05:29:04.419576 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-h4rjf\\\\nI0314 05:29:04.419536 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5lt5l\\\\nI0314 05:29:04.419584 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0314 05:29:04.419590 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-sx769\\\\nI0314 05:29:04.419595 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2\\\\nI0314 05:29:04.419604 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0314 05:29:04.419622 6937 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
h\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.059599 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"2026-03-14T05:28:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956\\\\n2026-03-14T05:28:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956 to /host/opt/cni/bin/\\\\n2026-03-14T05:28:42Z [verbose] multus-daemon started\\\\n2026-03-14T05:28:42Z [verbose] 
Readiness Indicator file check\\\\n2026-03-14T05:29:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.078313 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01
839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.095815 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.113366 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.126247 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.137319 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.148654 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.158573 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.170373 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.563661 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:35 crc kubenswrapper[4713]: E0314 05:29:35.563937 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.580173 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.887675 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/3.log" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.888941 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/2.log" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.893145 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339" exitCode=1 Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.893313 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339"} Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.893435 4713 scope.go:117] "RemoveContainer" containerID="94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.894917 4713 scope.go:117] "RemoveContainer" containerID="36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339" Mar 14 05:29:35 crc kubenswrapper[4713]: E0314 05:29:35.895271 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.915993 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee
1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\",\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.934449 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"2026-03-14T05:28:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956\\\\n2026-03-14T05:28:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956 to /host/opt/cni/bin/\\\\n2026-03-14T05:28:42Z [verbose] multus-daemon started\\\\n2026-03-14T05:28:42Z [verbose] Readiness Indicator file check\\\\n2026-03-14T05:29:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.949735 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.963092 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c8c201b-727d-4e54-9e89-b41fc70aa438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2f424c581ebd1d65c32edf68989401404343e9dffb9b83c9d6fed80f65382b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.977789 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:35 crc kubenswrapper[4713]: I0314 05:29:35.992464 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.008085 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.024189 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.041014 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.059613 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.081116 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.102372 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.119098 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.134140 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.146175 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.158278 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc 
kubenswrapper[4713]: I0314 05:29:36.166307 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.166356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.166369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.166391 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.166407 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:36Z","lastTransitionTime":"2026-03-14T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.179612 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.181857 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94336a69590c9f8092cc70dbf574836aa1c4ddd2ceaf5857538edfdf248f398e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:04Z\\\",\\\"message\\\":\\\"fter 0 failed attempt(s)\\\\nI0314 05:29:04.419535 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0314 05:29:04.419579 6937 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4ds64\\\\nI0314 05:29:04.419576 6937 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-h4rjf\\\\nI0314 05:29:04.419536 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5lt5l\\\\nI0314 05:29:04.419584 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0314 05:29:04.419590 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-sx769\\\\nI0314 05:29:04.419595 6937 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2\\\\nI0314 05:29:04.419604 6937 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF0314 05:29:04.419622 6937 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, h\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:35Z\\\",\\\"message\\\":\\\"rs/externalversions/factory.go:141\\\\nI0314 05:29:35.498588 7270 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 05:29:35.499042 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 05:29:35.499085 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 05:29:35.499119 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 05:29:35.499174 7270 
handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 05:29:35.499190 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 05:29:35.499220 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 05:29:35.499249 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 05:29:35.499274 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 05:29:35.499254 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 05:29:35.499341 7270 factory.go:656] Stopping watch factory\\\\nI0314 05:29:35.499365 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0314 05:29:35.499371 7270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 05:29:35.499399 7270 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 05:29:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\
"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.184106 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.184161 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.184182 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.184203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.184243 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:36Z","lastTransitionTime":"2026-03-14T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.199170 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.202741 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.202795 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.202808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.202829 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.202844 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:36Z","lastTransitionTime":"2026-03-14T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.218636 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.223009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.223059 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.223075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.223099 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.223114 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:36Z","lastTransitionTime":"2026-03-14T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.236301 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.239965 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.240008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.240018 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.240036 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.240049 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:36Z","lastTransitionTime":"2026-03-14T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.253034 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.253186 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.563544 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.563637 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.563558 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.563795 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.563951 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.564133 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.899265 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/3.log" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.905010 4713 scope.go:117] "RemoveContainer" containerID="36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339" Mar 14 05:29:36 crc kubenswrapper[4713]: E0314 05:29:36.905223 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.920794 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.935273 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.950942 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.970268 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.980399 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:36 crc kubenswrapper[4713]: I0314 05:29:36.988531 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc 
kubenswrapper[4713]: I0314 05:29:37.006684 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:35Z\\\",\\\"message\\\":\\\"rs/externalversions/factory.go:141\\\\nI0314 05:29:35.498588 7270 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 05:29:35.499042 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 
05:29:35.499085 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 05:29:35.499119 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 05:29:35.499174 7270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 05:29:35.499190 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 05:29:35.499220 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 05:29:35.499249 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 05:29:35.499274 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 05:29:35.499254 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 05:29:35.499341 7270 factory.go:656] Stopping watch factory\\\\nI0314 05:29:35.499365 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0314 05:29:35.499371 7270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 05:29:35.499399 7270 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 05:29:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.021730 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.032981 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"2026-03-14T05:28:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956\\\\n2026-03-14T05:28:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956 to /host/opt/cni/bin/\\\\n2026-03-14T05:28:42Z [verbose] multus-daemon started\\\\n2026-03-14T05:28:42Z [verbose] Readiness Indicator file check\\\\n2026-03-14T05:29:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.042770 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.051769 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28
:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.061828 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c8c201b-727d-4e54-9e89-b41fc70aa438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2f424c581ebd1d65c32edf68989401404343e9dffb9b83c9d6fed80f65382b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.074571 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.087579 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.100454 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.110535 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.122503 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.563665 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:37 crc kubenswrapper[4713]: E0314 05:29:37.563822 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.593175 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:35Z\\\",\\\"message\\\":\\\"rs/externalversions/factory.go:141\\\\nI0314 05:29:35.498588 7270 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 05:29:35.499042 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 
05:29:35.499085 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 05:29:35.499119 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 05:29:35.499174 7270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 05:29:35.499190 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 05:29:35.499220 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 05:29:35.499249 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 05:29:35.499274 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 05:29:35.499254 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 05:29:35.499341 7270 factory.go:656] Stopping watch factory\\\\nI0314 05:29:35.499365 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0314 05:29:35.499371 7270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 05:29:35.499399 7270 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 05:29:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.608423 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01
839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.623991 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.637285 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"2026-03-14T05:28:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956\\\\n2026-03-14T05:28:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956 to /host/opt/cni/bin/\\\\n2026-03-14T05:28:42Z [verbose] multus-daemon started\\\\n2026-03-14T05:28:42Z [verbose] Readiness Indicator file check\\\\n2026-03-14T05:29:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.650620 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.664964 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: E0314 05:29:37.673965 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.684756 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.695883 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.705738 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.723396 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.733112 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c8c201b-727d-4e54-9e89-b41fc70aa438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2f424c581ebd1d65c32edf68989401404343e9dffb9b83c9d6fed80f65382b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.745652 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.765877 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.781441 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.792332 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc kubenswrapper[4713]: I0314 05:29:37.804012 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:37 crc 
kubenswrapper[4713]: I0314 05:29:37.817949 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:38 crc kubenswrapper[4713]: I0314 05:29:38.563160 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:38 crc kubenswrapper[4713]: E0314 05:29:38.563411 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:38 crc kubenswrapper[4713]: I0314 05:29:38.563444 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:38 crc kubenswrapper[4713]: E0314 05:29:38.563977 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:38 crc kubenswrapper[4713]: I0314 05:29:38.563538 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:38 crc kubenswrapper[4713]: E0314 05:29:38.564423 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:38 crc kubenswrapper[4713]: I0314 05:29:38.577739 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 14 05:29:39 crc kubenswrapper[4713]: I0314 05:29:39.563076 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:39 crc kubenswrapper[4713]: E0314 05:29:39.563284 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:40 crc kubenswrapper[4713]: I0314 05:29:40.563614 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:40 crc kubenswrapper[4713]: E0314 05:29:40.564325 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:40 crc kubenswrapper[4713]: I0314 05:29:40.563675 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:40 crc kubenswrapper[4713]: E0314 05:29:40.564562 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:40 crc kubenswrapper[4713]: I0314 05:29:40.563674 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:40 crc kubenswrapper[4713]: E0314 05:29:40.564855 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:41 crc kubenswrapper[4713]: I0314 05:29:41.563447 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:41 crc kubenswrapper[4713]: E0314 05:29:41.564528 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:42 crc kubenswrapper[4713]: I0314 05:29:42.563165 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:42 crc kubenswrapper[4713]: I0314 05:29:42.563230 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:42 crc kubenswrapper[4713]: I0314 05:29:42.563641 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.563789 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.563970 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.564005 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:42 crc kubenswrapper[4713]: I0314 05:29:42.609261 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:29:42 crc kubenswrapper[4713]: I0314 05:29:42.609344 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:42 crc kubenswrapper[4713]: I0314 05:29:42.609364 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:42 crc kubenswrapper[4713]: I0314 05:29:42.609392 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:42 crc kubenswrapper[4713]: I0314 05:29:42.609441 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609506 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.609484021 +0000 UTC m=+229.697393321 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609539 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609587 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609592 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609617 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609631 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609730 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609603 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609773 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609607 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.609594165 +0000 UTC m=+229.697503465 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609848 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.609822241 +0000 UTC m=+229.697731561 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609879 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.609867002 +0000 UTC m=+229.697776322 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.609909 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.609898333 +0000 UTC m=+229.697807653 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 05:29:42 crc kubenswrapper[4713]: E0314 05:29:42.675438 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:29:43 crc kubenswrapper[4713]: I0314 05:29:43.563442 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:43 crc kubenswrapper[4713]: E0314 05:29:43.563966 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:44 crc kubenswrapper[4713]: I0314 05:29:44.562888 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:44 crc kubenswrapper[4713]: I0314 05:29:44.562888 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:44 crc kubenswrapper[4713]: I0314 05:29:44.562926 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:44 crc kubenswrapper[4713]: E0314 05:29:44.564478 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:44 crc kubenswrapper[4713]: E0314 05:29:44.564551 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:44 crc kubenswrapper[4713]: E0314 05:29:44.564618 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:45 crc kubenswrapper[4713]: I0314 05:29:45.563729 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:45 crc kubenswrapper[4713]: E0314 05:29:45.564109 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.331438 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.331513 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.331525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.331543 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.331561 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:46Z","lastTransitionTime":"2026-03-14T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:46 crc kubenswrapper[4713]: E0314 05:29:46.352633 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:46Z is after 2025-08-24T17:21:41Z"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.358510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.358565 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.358578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.358598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.358611 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:46Z","lastTransitionTime":"2026-03-14T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:46 crc kubenswrapper[4713]: E0314 05:29:46.385761 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:46Z is after 2025-08-24T17:21:41Z"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.391947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.392000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.392011 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.392031 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.392043 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:46Z","lastTransitionTime":"2026-03-14T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:46 crc kubenswrapper[4713]: E0314 05:29:46.408914 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.414399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.414465 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.414481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.414507 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.414527 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:46Z","lastTransitionTime":"2026-03-14T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:46 crc kubenswrapper[4713]: E0314 05:29:46.439603 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.445541 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.445590 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.445609 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.445635 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.445649 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:46Z","lastTransitionTime":"2026-03-14T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:46 crc kubenswrapper[4713]: E0314 05:29:46.464028 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:46Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:46 crc kubenswrapper[4713]: E0314 05:29:46.464306 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.563150 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.563236 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:46 crc kubenswrapper[4713]: E0314 05:29:46.563559 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:46 crc kubenswrapper[4713]: I0314 05:29:46.563237 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:46 crc kubenswrapper[4713]: E0314 05:29:46.563746 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:46 crc kubenswrapper[4713]: E0314 05:29:46.563840 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.562828 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:47 crc kubenswrapper[4713]: E0314 05:29:47.563377 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.601812 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:35Z\\\",\\\"message\\\":\\\"rs/externalversions/factory.go:141\\\\nI0314 05:29:35.498588 7270 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 05:29:35.499042 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 
05:29:35.499085 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 05:29:35.499119 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 05:29:35.499174 7270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 05:29:35.499190 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 05:29:35.499220 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 05:29:35.499249 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 05:29:35.499274 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 05:29:35.499254 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 05:29:35.499341 7270 factory.go:656] Stopping watch factory\\\\nI0314 05:29:35.499365 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0314 05:29:35.499371 7270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 05:29:35.499399 7270 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 05:29:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.628026 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.650336 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"2026-03-14T05:28:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956\\\\n2026-03-14T05:28:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956 to /host/opt/cni/bin/\\\\n2026-03-14T05:28:42Z [verbose] multus-daemon started\\\\n2026-03-14T05:28:42Z [verbose] Readiness Indicator file check\\\\n2026-03-14T05:29:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: E0314 05:29:47.676320 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.678403 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01
839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.695251 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.711291 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.726526 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.741184 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.752722 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf
89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.765199 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c8c201b-727d-4e54-9e89-b41fc70aa438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2f424c581ebd1d65c32edf68989401404343e9dffb9b83c9d6fed80f65382b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.779832 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.793533 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.809820 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.825878 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.839593 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc 
kubenswrapper[4713]: I0314 05:29:47.853946 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.870420 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ca9d49-fdfe-4bf3-bac5-6be4144d7a69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0563ae22d9421b3f5da796c13d8cadc6514d52f1cfad39b4e2b36f6bb3ce460e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d84a02ccd79f3547e650b0c51decba1af1f2b6fa6d8fc9a0bde3f5a726d4ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:57Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 05:27:28.155490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 05:27:28.156229 1 observer_polling.go:159] Starting file observer\\\\nI0314 05:27:28.156848 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 05:27:28.157350 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 05:27:54.801869 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0314 05:27:57.800562 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 05:27:57.801027 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context 
canceled\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2b180c1d9cb12d2671ac8458febd65d67666e67306dfc203cc8890e2cb5b15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251da253d7431414854ca2550b3bed65f90d2a661cd6c42fb6c0770bcb84421\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5a6e7fb2309ccbbdf3ee09496cec1939b803616e46bd5080e12d654f97c299\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:47 crc kubenswrapper[4713]: I0314 05:29:47.887888 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:47Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:48 crc kubenswrapper[4713]: I0314 05:29:48.563656 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:48 crc kubenswrapper[4713]: I0314 05:29:48.563768 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:48 crc kubenswrapper[4713]: I0314 05:29:48.563862 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:48 crc kubenswrapper[4713]: E0314 05:29:48.564313 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:48 crc kubenswrapper[4713]: E0314 05:29:48.564556 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:48 crc kubenswrapper[4713]: E0314 05:29:48.566781 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:48 crc kubenswrapper[4713]: I0314 05:29:48.567142 4713 scope.go:117] "RemoveContainer" containerID="36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339" Mar 14 05:29:48 crc kubenswrapper[4713]: E0314 05:29:48.567569 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" Mar 14 05:29:48 crc kubenswrapper[4713]: I0314 05:29:48.588645 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 14 05:29:49 crc kubenswrapper[4713]: I0314 05:29:49.563333 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:49 crc kubenswrapper[4713]: E0314 05:29:49.563595 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:50 crc kubenswrapper[4713]: I0314 05:29:50.563304 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:29:50 crc kubenswrapper[4713]: I0314 05:29:50.563418 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:50 crc kubenswrapper[4713]: I0314 05:29:50.563461 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:50 crc kubenswrapper[4713]: E0314 05:29:50.563569 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:29:50 crc kubenswrapper[4713]: E0314 05:29:50.563757 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:29:50 crc kubenswrapper[4713]: E0314 05:29:50.563880 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:29:51 crc kubenswrapper[4713]: I0314 05:29:51.564281 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:51 crc kubenswrapper[4713]: E0314 05:29:51.564494 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:52 crc kubenswrapper[4713]: I0314 05:29:52.563357 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:29:52 crc kubenswrapper[4713]: I0314 05:29:52.563357 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:29:52 crc kubenswrapper[4713]: I0314 05:29:52.563437 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 05:29:52 crc kubenswrapper[4713]: E0314 05:29:52.564410 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 14 05:29:52 crc kubenswrapper[4713]: E0314 05:29:52.564646 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 14 05:29:52 crc kubenswrapper[4713]: E0314 05:29:52.565266 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 14 05:29:52 crc kubenswrapper[4713]: E0314 05:29:52.677957 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 14 05:29:53 crc kubenswrapper[4713]: I0314 05:29:53.567503 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv"
Mar 14 05:29:53 crc kubenswrapper[4713]: E0314 05:29:53.568289 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3"
Mar 14 05:29:54 crc kubenswrapper[4713]: I0314 05:29:54.563054 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 05:29:54 crc kubenswrapper[4713]: I0314 05:29:54.563054 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 05:29:54 crc kubenswrapper[4713]: I0314 05:29:54.563299 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 05:29:54 crc kubenswrapper[4713]: E0314 05:29:54.563436 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 14 05:29:54 crc kubenswrapper[4713]: E0314 05:29:54.563822 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 14 05:29:54 crc kubenswrapper[4713]: E0314 05:29:54.565174 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 14 05:29:55 crc kubenswrapper[4713]: I0314 05:29:55.563765 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv"
Mar 14 05:29:55 crc kubenswrapper[4713]: E0314 05:29:55.563986 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.562678 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.562772 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.562808 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 05:29:56 crc kubenswrapper[4713]: E0314 05:29:56.563714 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 14 05:29:56 crc kubenswrapper[4713]: E0314 05:29:56.563841 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 14 05:29:56 crc kubenswrapper[4713]: E0314 05:29:56.563933 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.605675 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.605716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.605727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.605743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.605753 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:56Z","lastTransitionTime":"2026-03-14T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 14 05:29:56 crc kubenswrapper[4713]: E0314 05:29:56.626047 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:56Z is after 2025-08-24T17:21:41Z"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.631671 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.631706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.631717 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.631733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.631772 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:56Z","lastTransitionTime":"2026-03-14T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 14 05:29:56 crc kubenswrapper[4713]: E0314 05:29:56.645311 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:56Z is after 2025-08-24T17:21:41Z"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.650670 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.650720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.650734 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.650753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.650767 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:56Z","lastTransitionTime":"2026-03-14T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 14 05:29:56 crc kubenswrapper[4713]: E0314 05:29:56.670456 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.675337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.675372 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.675384 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.675403 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.675414 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:56Z","lastTransitionTime":"2026-03-14T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:56 crc kubenswrapper[4713]: E0314 05:29:56.691314 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.696230 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.696456 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.696467 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.696484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:29:56 crc kubenswrapper[4713]: I0314 05:29:56.696496 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:29:56Z","lastTransitionTime":"2026-03-14T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:29:56 crc kubenswrapper[4713]: E0314 05:29:56.708818 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:56 crc kubenswrapper[4713]: E0314 05:29:56.708932 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.294856 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:57 crc kubenswrapper[4713]: E0314 05:29:57.295051 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:29:57 crc kubenswrapper[4713]: E0314 05:29:57.295146 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs podName:8f2a1689-2973-4684-88d0-4ac7edb9b1d3 nodeName:}" failed. No retries permitted until 2026-03-14 05:31:01.295122688 +0000 UTC m=+244.383031998 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs") pod "network-metrics-daemon-2t6mv" (UID: "8f2a1689-2973-4684-88d0-4ac7edb9b1d3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.563376 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:29:57 crc kubenswrapper[4713]: E0314 05:29:57.563593 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.584001 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.598734 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"2026-03-14T05:28:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956\\\\n2026-03-14T05:28:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956 to /host/opt/cni/bin/\\\\n2026-03-14T05:28:42Z [verbose] multus-daemon started\\\\n2026-03-14T05:28:42Z [verbose] Readiness Indicator file check\\\\n2026-03-14T05:29:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.611798 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.626629 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.640655 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.654240 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926aeb0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.668576 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c8c201b-727d-4e54-9e89-b41fc70aa438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2f424c581ebd1d65c32edf68989401404343e9dffb9b83c9d6fed80f65382b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc 
kubenswrapper[4713]: E0314 05:29:57.678771 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.691423 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b66a23a-0487-40ec-9b9a-08336208b1bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff7c162dc04b24c00fe296bd9f54f629666770337c5b5a1758ad7c1ffc2b313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8cf31a51e4aed330e2243890414f6e77e0b2376c7d10d29bb181c11a004ebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43edeb2fb633e59075be5441af86a4900252c13999743027187349c45a9818c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2814
7b0d7173eb1ae6af953e5ceb55a4d79ef66d2fc79470b427a3054b8b30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15eeeed96a6933a36b7b173b63deb9670a36b6be7ab19abec94835db3e568aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b11f1baf195192fbec7d445e546c6c16e48917bcca3ed67d10f699d3f268ce2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b11f1baf195192fbec7d445e546c6c16e48917bcca3ed67d10f699d3f268ce2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b148a74acc00c7190f62bc47111dade546767963ad6c61880477dcfc9b591e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b148a74acc00c7190f62bc47111dade546767963ad6c61880477dcfc9b591e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c24f5d4239218a588477d136d37232bdb04685dd0f1b46cddb6fcce7d84465e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24f5d4239218a588477d136d37232bdb04685dd0f1b46cddb6fcce7d84465e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.706658 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.726331 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.742783 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.755187 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816
d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.769786 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc 
kubenswrapper[4713]: I0314 05:29:57.782382 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.803389 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ca9d49-fdfe-4bf3-bac5-6be4144d7a69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0563ae22d9421b3f5da796c13d8cadc6514d52f1cfad39b4e2b36f6bb3ce460e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d84a02ccd79f3547e650b0c51decba1af1f2b6fa6d8fc9a0bde3f5a726d4ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:57Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 05:27:28.155490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 05:27:28.156229 1 observer_polling.go:159] Starting file observer\\\\nI0314 05:27:28.156848 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 05:27:28.157350 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 05:27:54.801869 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0314 05:27:57.800562 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 05:27:57.801027 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context 
canceled\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2b180c1d9cb12d2671ac8458febd65d67666e67306dfc203cc8890e2cb5b15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251da253d7431414854ca2550b3bed65f90d2a661cd6c42fb6c0770bcb84421\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5a6e7fb2309ccbbdf3ee09496cec1939b803616e46bd5080e12d654f97c299\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.819628 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.837853 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.856280 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z" Mar 14 05:29:57 crc kubenswrapper[4713]: I0314 05:29:57.882319 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:35Z\\\",\\\"message\\\":\\\"rs/externalversions/factory.go:141\\\\nI0314 05:29:35.498588 7270 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 05:29:35.499042 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 
05:29:35.499085 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 05:29:35.499119 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 05:29:35.499174 7270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 05:29:35.499190 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 05:29:35.499220 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 05:29:35.499249 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 05:29:35.499274 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 05:29:35.499254 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 05:29:35.499341 7270 factory.go:656] Stopping watch factory\\\\nI0314 05:29:35.499365 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0314 05:29:35.499371 7270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 05:29:35.499399 7270 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 05:29:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:29:57Z is after 2025-08-24T17:21:41Z"
Mar 14 05:29:58 crc kubenswrapper[4713]: I0314 05:29:58.562750 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 05:29:58 crc kubenswrapper[4713]: I0314 05:29:58.562888 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 05:29:58 crc kubenswrapper[4713]: I0314 05:29:58.562750 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 05:29:58 crc kubenswrapper[4713]: E0314 05:29:58.563000 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 14 05:29:58 crc kubenswrapper[4713]: E0314 05:29:58.563336 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 14 05:29:58 crc kubenswrapper[4713]: E0314 05:29:58.563411 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 14 05:29:59 crc kubenswrapper[4713]: I0314 05:29:59.563370 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv"
Mar 14 05:29:59 crc kubenswrapper[4713]: E0314 05:29:59.563675 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3"
Mar 14 05:30:00 crc kubenswrapper[4713]: I0314 05:30:00.563607 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 05:30:00 crc kubenswrapper[4713]: I0314 05:30:00.563656 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 05:30:00 crc kubenswrapper[4713]: I0314 05:30:00.563682 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 05:30:00 crc kubenswrapper[4713]: E0314 05:30:00.563832 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 14 05:30:00 crc kubenswrapper[4713]: E0314 05:30:00.563968 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 14 05:30:00 crc kubenswrapper[4713]: E0314 05:30:00.564099 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 14 05:30:01 crc kubenswrapper[4713]: I0314 05:30:01.563595 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv"
Mar 14 05:30:01 crc kubenswrapper[4713]: E0314 05:30:01.563747 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3"
Mar 14 05:30:02 crc kubenswrapper[4713]: I0314 05:30:02.562972 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 05:30:02 crc kubenswrapper[4713]: I0314 05:30:02.563035 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 05:30:02 crc kubenswrapper[4713]: I0314 05:30:02.563035 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 05:30:02 crc kubenswrapper[4713]: E0314 05:30:02.563334 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 14 05:30:02 crc kubenswrapper[4713]: E0314 05:30:02.563614 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 14 05:30:02 crc kubenswrapper[4713]: E0314 05:30:02.563685 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 14 05:30:02 crc kubenswrapper[4713]: E0314 05:30:02.680450 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 14 05:30:03 crc kubenswrapper[4713]: I0314 05:30:03.563717 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv"
Mar 14 05:30:03 crc kubenswrapper[4713]: E0314 05:30:03.564162 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3"
Mar 14 05:30:03 crc kubenswrapper[4713]: I0314 05:30:03.565713 4713 scope.go:117] "RemoveContainer" containerID="36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339"
Mar 14 05:30:03 crc kubenswrapper[4713]: E0314 05:30:03.566140 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464"
Mar 14 05:30:04 crc kubenswrapper[4713]: I0314 05:30:04.563018 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 05:30:04 crc kubenswrapper[4713]: I0314 05:30:04.563073 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 05:30:04 crc kubenswrapper[4713]: I0314 05:30:04.563272 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 05:30:04 crc kubenswrapper[4713]: E0314 05:30:04.563434 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 14 05:30:04 crc kubenswrapper[4713]: E0314 05:30:04.563650 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 14 05:30:04 crc kubenswrapper[4713]: E0314 05:30:04.563992 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 14 05:30:05 crc kubenswrapper[4713]: I0314 05:30:05.563251 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv"
Mar 14 05:30:05 crc kubenswrapper[4713]: E0314 05:30:05.563542 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3"
Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.563402 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.563542 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.563402 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 14 05:30:06 crc kubenswrapper[4713]: E0314 05:30:06.563665 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 14 05:30:06 crc kubenswrapper[4713]: E0314 05:30:06.563776 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 14 05:30:06 crc kubenswrapper[4713]: E0314 05:30:06.563892 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.972056 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.972118 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.972137 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.972156 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.972170 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:30:06Z","lastTransitionTime":"2026-03-14T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 14 05:30:06 crc kubenswrapper[4713]: E0314 05:30:06.989984 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:06Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.996522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.996610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.996627 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.996651 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:30:06 crc kubenswrapper[4713]: I0314 05:30:06.996668 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:30:06Z","lastTransitionTime":"2026-03-14T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:30:07 crc kubenswrapper[4713]: E0314 05:30:07.015547 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.020885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.020960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.020980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.021009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.021029 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:30:07Z","lastTransitionTime":"2026-03-14T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:30:07 crc kubenswrapper[4713]: E0314 05:30:07.036879 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.042156 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.042277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.042306 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.042346 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.042383 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:30:07Z","lastTransitionTime":"2026-03-14T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:30:07 crc kubenswrapper[4713]: E0314 05:30:07.061151 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.066855 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.066923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.066947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.066972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.066989 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:30:07Z","lastTransitionTime":"2026-03-14T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 05:30:07 crc kubenswrapper[4713]: E0314 05:30:07.087912 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T05:30:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4bde9573-5f81-4b74-9efd-4a4fc782d1ac\\\",\\\"systemUUID\\\":\\\"ebb26f6c-82bb-440f-9e91-918044368ed3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: E0314 05:30:07.088069 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.563150 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:07 crc kubenswrapper[4713]: E0314 05:30:07.563355 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.581883 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h4rjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f4a76d9-a890-4a25-bd97-411e6a8a9bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3804215000cac6a9e5200250c7e8767feeaca827c3ce929c94279504146ed92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-g2bzw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h4rjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.600716 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5lt5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81c9353e-167d-4f25-9c45-3649456e4263\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f30b09da05282fe25ec2597b45ca70289d79ceac242ca926ae
b0bf89b6817a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hm6f5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5lt5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.615954 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c8c201b-727d-4e54-9e89-b41fc70aa438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2f424c581ebd1d65c32edf68989401404343e9dffb9b83c9d6fed80f65382b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be7b1d4c4ed7bc664b4c9fe7bbd6cd1f5c21fa61c2844eeecbafb82544d07ec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.642488 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b66a23a-0487-40ec-9b9a-08336208b1bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aff7c162dc04b24c00fe296bd9f54f629666770337c5b5a1758ad7c1ffc2b313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8cf31a51e4aed330e2243890414f6e77e0b2376c7d10d29bb181c11a004ebe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43edeb2fb633e59075be5441af86a4900252c13999743027187349c45a9818c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c28147b0d7173eb1ae6af953e5ceb55a4d79ef66d2fc79470b427a3054b8b30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15eeeed96a6933a36b7b173b63deb9670a36b6be7ab19abec94835db3e568aa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b11f1baf195192fbec7d445e546c6c16e48917bcca3ed67d10f699d3f268ce2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b11f1baf195192fbec7d445e546c6c16e48917bcca3ed67d10f699d3f268ce2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b148a74acc00c7190f62bc47111dade546767963ad6c61880477dcfc9b591e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b148a74acc00c7190f62bc47111dade546767963ad6c61880477dcfc9b591e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c24f5d4239218a588477d136d37232bdb04685dd0f1b46cddb6fcce7d84465e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24f5d4239218a588477d136d37232bdb04685dd0f1b46cddb6fcce7d84465e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:27:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.673837 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: E0314 05:30:07.681186 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.693381 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.708875 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.722890 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a12c1a8d87f82ed7210e82efea9619a46a12258c918e0b12d9d80fcadef9b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.740365 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdwlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2t6mv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc 
kubenswrapper[4713]: I0314 05:30:07.757684 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6c18f9f-5cde-4b5a-9b58-e7b894633573\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21900643a018ca5ce3eee21a5ccbe87996d597f7515ef455c172e0a53649739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200d525d97ed6eb47421d42df531ce164618560e2c01f5de9e65871aed4d1ce2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64a6432daa230b7896fa698ac2b6d9efab9d706177ec4d3908c1f1ac2376d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5be381f1fd6b291cb3952ae28789f6c1a1d6262d9f9526532b4d17cc8e05bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.776404 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2ca9d49-fdfe-4bf3-bac5-6be4144d7a69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0563ae22d9421b3f5da796c13d8cadc6514d52f1cfad39b4e2b36f6bb3ce460e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d84a02ccd79f3547e650b0c51decba1af1f2b6fa6d8fc9a0bde3f5a726d4ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:57Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 05:27:28.155490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 05:27:28.156229 1 observer_polling.go:159] Starting file observer\\\\nI0314 05:27:28.156848 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 05:27:28.157350 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 05:27:54.801869 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0314 05:27:57.800562 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 05:27:57.801027 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context 
canceled\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2b180c1d9cb12d2671ac8458febd65d67666e67306dfc203cc8890e2cb5b15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3251da253d7431414854ca2550b3bed65f90d2a661cd6c42fb6c0770bcb84421\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5a6e7fb2309ccbbdf3ee09496cec1939b803616e46bd5080e12d654f97c299\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.795362 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea52f243fd0f328a5ca04ed16425216e904cc29c495b60c249dc70feced1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.816068 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f200575b97f2d476901d3f3e2750032162ac46c8ab2f6f02bcf9decec2396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3f29dd9142142914d966d87ed5abd1f1c57d24750791038994641e78bbaa788d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.835508 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sx769" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8308bcb2-29df-4a11-86f3-b031e612b314\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cedbcce4231e2a664377b4d9a046665565a10eeddcc533d44cd23e1f84031e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d688b0e309f8b9413cd4de9b70f3bc283ef6650929a5478e97df8c08a20112e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab035134159e1dff5f9c0e59c3886c9ef5ebb1afb565148969d1cb29c5d7f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c78fa8efd4ff3418e24dc2674ebcef4f0253f20dbdf9e74a45da58b730b53638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98f4
e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98f4e268b49890c250b9f67fc91205f48bbd103b3d753665270585bbda05ad6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://899d1e30e8e024575eda3eadf2e0f8b08d5652eecf6d6fb2a873ffb3a6ae4c1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394320e013fbd2a307d9de99204e8ce50a464919ac3bf65b06634872213f63ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5c5t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sx769\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.854190 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29ffc9b1-a736-461e-a7e2-14dca8f3a9e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634b2d4306a48d34e21d50a32ed0c1336262293d83ae2fa7a698e7957a64fb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2333436cdc40f89903b570f029a0e816d10b59ab0a9f0c45d8e8cba3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct8dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bnsh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.877945 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6632626e-d806-4de3-b20a-6ee10099a464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:35Z\\\",\\\"message\\\":\\\"rs/externalversions/factory.go:141\\\\nI0314 05:29:35.498588 7270 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 05:29:35.499042 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 
05:29:35.499085 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 05:29:35.499119 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 05:29:35.499174 7270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 05:29:35.499190 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 05:29:35.499220 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 05:29:35.499249 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 05:29:35.499274 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 05:29:35.499254 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 05:29:35.499341 7270 factory.go:656] Stopping watch factory\\\\nI0314 05:29:35.499365 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0314 05:29:35.499371 7270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 05:29:35.499399 7270 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 05:29:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:29:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://245ad19a9f1cbea72c
86c724506544c9df3edbe510d085bd28af378ca0c1a53c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tztwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4ds64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.900355 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1346874-95a4-4f6c-9655-b7122d808169\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T05:27:58Z\\\"
,\\\"message\\\":\\\"W0314 05:27:57.887542 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 05:27:57.888161 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773466077 cert, and key in /tmp/serving-cert-2978056429/serving-signer.crt, /tmp/serving-cert-2978056429/serving-signer.key\\\\nI0314 05:27:58.119693 1 observer_polling.go:159] Starting file observer\\\\nW0314 05:27:58.129910 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 05:27:58.130136 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 05:27:58.131092 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2978056429/tls.crt::/tmp/serving-cert-2978056429/tls.key\\\\\\\"\\\\nF0314 05:27:58.483953 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:27:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T05:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T05:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:26:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.918533 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5l5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"703b6542-1a83-442a-9673-6a774399dd7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:29:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T05:29:27Z\\\",\\\"message\\\":\\\"2026-03-14T05:28:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956\\\\n2026-03-14T05:28:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0e9008e5-fdff-4617-8c15-8413d250e956 to /host/opt/cni/bin/\\\\n2026-03-14T05:28:42Z [verbose] multus-daemon started\\\\n2026-03-14T05:28:42Z [verbose] Readiness Indicator file check\\\\n2026-03-14T05:29:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5l5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:07 crc kubenswrapper[4713]: I0314 05:30:07.929620 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6cc7fbb-a88a-4b94-89bb-1323e0751467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T05:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7110edba67d2fb9a8f62a736e7c4a852f23ac4ed5e5aed34ece0a14fb6c1c4bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T05:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v24t7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T05:28:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ls8z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T05:30:07Z is after 2025-08-24T17:21:41Z" Mar 14 05:30:08 crc kubenswrapper[4713]: I0314 05:30:08.563506 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:08 crc kubenswrapper[4713]: I0314 05:30:08.563627 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:08 crc kubenswrapper[4713]: E0314 05:30:08.563701 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:08 crc kubenswrapper[4713]: I0314 05:30:08.563793 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:08 crc kubenswrapper[4713]: E0314 05:30:08.563956 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:08 crc kubenswrapper[4713]: E0314 05:30:08.564147 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:09 crc kubenswrapper[4713]: I0314 05:30:09.563746 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:09 crc kubenswrapper[4713]: E0314 05:30:09.564419 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:10 crc kubenswrapper[4713]: I0314 05:30:10.563400 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:10 crc kubenswrapper[4713]: I0314 05:30:10.563455 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:10 crc kubenswrapper[4713]: I0314 05:30:10.563455 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:10 crc kubenswrapper[4713]: E0314 05:30:10.563643 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:10 crc kubenswrapper[4713]: E0314 05:30:10.563785 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:10 crc kubenswrapper[4713]: E0314 05:30:10.563944 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:11 crc kubenswrapper[4713]: I0314 05:30:11.563684 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:11 crc kubenswrapper[4713]: E0314 05:30:11.563914 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:12 crc kubenswrapper[4713]: I0314 05:30:12.563652 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:12 crc kubenswrapper[4713]: I0314 05:30:12.563708 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:12 crc kubenswrapper[4713]: I0314 05:30:12.563835 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:12 crc kubenswrapper[4713]: E0314 05:30:12.563988 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:12 crc kubenswrapper[4713]: E0314 05:30:12.564139 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:12 crc kubenswrapper[4713]: E0314 05:30:12.564419 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:12 crc kubenswrapper[4713]: E0314 05:30:12.683002 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:30:13 crc kubenswrapper[4713]: I0314 05:30:13.562869 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:13 crc kubenswrapper[4713]: E0314 05:30:13.563164 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.059285 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5l5jq_703b6542-1a83-442a-9673-6a774399dd7e/kube-multus/1.log" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.060026 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5l5jq_703b6542-1a83-442a-9673-6a774399dd7e/kube-multus/0.log" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.060098 4713 generic.go:334] "Generic (PLEG): container finished" podID="703b6542-1a83-442a-9673-6a774399dd7e" containerID="430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8" exitCode=1 Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.060152 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5l5jq" event={"ID":"703b6542-1a83-442a-9673-6a774399dd7e","Type":"ContainerDied","Data":"430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8"} Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.060313 4713 scope.go:117] "RemoveContainer" containerID="ef4b484789ef94bbe8381d717d59fecd8baef88561999285b41e59a228e22a78" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.061170 4713 scope.go:117] "RemoveContainer" containerID="430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8" Mar 14 05:30:14 crc kubenswrapper[4713]: E0314 05:30:14.061600 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5l5jq_openshift-multus(703b6542-1a83-442a-9673-6a774399dd7e)\"" pod="openshift-multus/multus-5l5jq" podUID="703b6542-1a83-442a-9673-6a774399dd7e" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.145267 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podStartSLOduration=128.145238885 podStartE2EDuration="2m8.145238885s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:14.118815296 +0000 UTC m=+197.206724636" watchObservedRunningTime="2026-03-14 05:30:14.145238885 +0000 UTC m=+197.233148185" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.193766 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=91.193738397 podStartE2EDuration="1m31.193738397s" podCreationTimestamp="2026-03-14 05:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:14.145904766 +0000 UTC m=+197.233814096" watchObservedRunningTime="2026-03-14 05:30:14.193738397 +0000 UTC m=+197.281647697" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.194139 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=26.19413131 podStartE2EDuration="26.19413131s" podCreationTimestamp="2026-03-14 05:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:14.193464289 +0000 UTC m=+197.281373589" watchObservedRunningTime="2026-03-14 05:30:14.19413131 +0000 UTC m=+197.282040620" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.306519 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h4rjf" podStartSLOduration=128.306495777 podStartE2EDuration="2m8.306495777s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-14 05:30:14.293424626 +0000 UTC m=+197.381333936" watchObservedRunningTime="2026-03-14 05:30:14.306495777 +0000 UTC m=+197.394405077" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.322684 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5lt5l" podStartSLOduration=128.322667745 podStartE2EDuration="2m8.322667745s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:14.306857279 +0000 UTC m=+197.394766599" watchObservedRunningTime="2026-03-14 05:30:14.322667745 +0000 UTC m=+197.410577045" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.349173 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=39.349139636 podStartE2EDuration="39.349139636s" podCreationTimestamp="2026-03-14 05:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:14.322937643 +0000 UTC m=+197.410846953" watchObservedRunningTime="2026-03-14 05:30:14.349139636 +0000 UTC m=+197.437048976" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.349679 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=36.349667452 podStartE2EDuration="36.349667452s" podCreationTimestamp="2026-03-14 05:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:14.348028741 +0000 UTC m=+197.435938081" watchObservedRunningTime="2026-03-14 05:30:14.349667452 +0000 UTC m=+197.437576802" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.404270 4713 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sx769" podStartSLOduration=128.404200344 podStartE2EDuration="2m8.404200344s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:14.403904835 +0000 UTC m=+197.491814235" watchObservedRunningTime="2026-03-14 05:30:14.404200344 +0000 UTC m=+197.492109694" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.421327 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bnsh2" podStartSLOduration=127.421292641 podStartE2EDuration="2m7.421292641s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:14.421246799 +0000 UTC m=+197.509156129" watchObservedRunningTime="2026-03-14 05:30:14.421292641 +0000 UTC m=+197.509201991" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.458276 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=81.458243421 podStartE2EDuration="1m21.458243421s" podCreationTimestamp="2026-03-14 05:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:14.457286741 +0000 UTC m=+197.545196061" watchObservedRunningTime="2026-03-14 05:30:14.458243421 +0000 UTC m=+197.546152781" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.562743 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.562766 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.562847 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:14 crc kubenswrapper[4713]: E0314 05:30:14.562899 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:14 crc kubenswrapper[4713]: E0314 05:30:14.563004 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:14 crc kubenswrapper[4713]: E0314 05:30:14.563184 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:14 crc kubenswrapper[4713]: I0314 05:30:14.563895 4713 scope.go:117] "RemoveContainer" containerID="36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339" Mar 14 05:30:14 crc kubenswrapper[4713]: E0314 05:30:14.564106 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4ds64_openshift-ovn-kubernetes(6632626e-d806-4de3-b20a-6ee10099a464)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" Mar 14 05:30:15 crc kubenswrapper[4713]: I0314 05:30:15.068296 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5l5jq_703b6542-1a83-442a-9673-6a774399dd7e/kube-multus/1.log" Mar 14 05:30:15 crc kubenswrapper[4713]: I0314 05:30:15.563335 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:15 crc kubenswrapper[4713]: E0314 05:30:15.563622 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:16 crc kubenswrapper[4713]: I0314 05:30:16.562973 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:16 crc kubenswrapper[4713]: I0314 05:30:16.563102 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:16 crc kubenswrapper[4713]: E0314 05:30:16.563187 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:16 crc kubenswrapper[4713]: E0314 05:30:16.563545 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:16 crc kubenswrapper[4713]: I0314 05:30:16.563841 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:16 crc kubenswrapper[4713]: E0314 05:30:16.563986 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.346768 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.346864 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.346888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.346923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.346944 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T05:30:17Z","lastTransitionTime":"2026-03-14T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.419335 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm"] Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.420002 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.425143 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.425170 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.425789 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.426138 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.540107 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c81260a2-5b2e-4e16-96c4-006eb230f7e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.540262 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c81260a2-5b2e-4e16-96c4-006eb230f7e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.540391 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c81260a2-5b2e-4e16-96c4-006eb230f7e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.540471 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c81260a2-5b2e-4e16-96c4-006eb230f7e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.540685 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c81260a2-5b2e-4e16-96c4-006eb230f7e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.563282 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:17 crc kubenswrapper[4713]: E0314 05:30:17.566179 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.615962 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.626654 4713 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.641486 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c81260a2-5b2e-4e16-96c4-006eb230f7e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.641559 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c81260a2-5b2e-4e16-96c4-006eb230f7e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.641569 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c81260a2-5b2e-4e16-96c4-006eb230f7e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.641648 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/c81260a2-5b2e-4e16-96c4-006eb230f7e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.641677 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c81260a2-5b2e-4e16-96c4-006eb230f7e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.641730 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c81260a2-5b2e-4e16-96c4-006eb230f7e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.641767 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c81260a2-5b2e-4e16-96c4-006eb230f7e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.643080 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c81260a2-5b2e-4e16-96c4-006eb230f7e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 
05:30:17.651536 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c81260a2-5b2e-4e16-96c4-006eb230f7e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.670841 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c81260a2-5b2e-4e16-96c4-006eb230f7e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6wqzm\" (UID: \"c81260a2-5b2e-4e16-96c4-006eb230f7e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: E0314 05:30:17.683675 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:30:17 crc kubenswrapper[4713]: I0314 05:30:17.743342 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" Mar 14 05:30:17 crc kubenswrapper[4713]: W0314 05:30:17.764479 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc81260a2_5b2e_4e16_96c4_006eb230f7e8.slice/crio-0e2c04bbde2df7d0c8527c8c21ff0cd9e45b0f5b3b641e8fba4953eb85d10707 WatchSource:0}: Error finding container 0e2c04bbde2df7d0c8527c8c21ff0cd9e45b0f5b3b641e8fba4953eb85d10707: Status 404 returned error can't find the container with id 0e2c04bbde2df7d0c8527c8c21ff0cd9e45b0f5b3b641e8fba4953eb85d10707 Mar 14 05:30:18 crc kubenswrapper[4713]: I0314 05:30:18.084405 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" event={"ID":"c81260a2-5b2e-4e16-96c4-006eb230f7e8","Type":"ContainerStarted","Data":"11a1a8da431a29ec892ad13090ec4304c27b7a348833a620feabdf46e55dd601"} Mar 14 05:30:18 crc kubenswrapper[4713]: I0314 05:30:18.084492 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" event={"ID":"c81260a2-5b2e-4e16-96c4-006eb230f7e8","Type":"ContainerStarted","Data":"0e2c04bbde2df7d0c8527c8c21ff0cd9e45b0f5b3b641e8fba4953eb85d10707"} Mar 14 05:30:18 crc kubenswrapper[4713]: I0314 05:30:18.111165 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wqzm" podStartSLOduration=132.111139227 podStartE2EDuration="2m12.111139227s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:18.110016622 +0000 UTC m=+201.197925922" watchObservedRunningTime="2026-03-14 05:30:18.111139227 +0000 UTC m=+201.199048527" Mar 14 05:30:18 crc kubenswrapper[4713]: I0314 05:30:18.563486 4713 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:18 crc kubenswrapper[4713]: I0314 05:30:18.563544 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:18 crc kubenswrapper[4713]: I0314 05:30:18.563562 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:18 crc kubenswrapper[4713]: E0314 05:30:18.563678 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:18 crc kubenswrapper[4713]: E0314 05:30:18.563790 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:18 crc kubenswrapper[4713]: E0314 05:30:18.564084 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:19 crc kubenswrapper[4713]: I0314 05:30:19.562935 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:19 crc kubenswrapper[4713]: E0314 05:30:19.563131 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:20 crc kubenswrapper[4713]: I0314 05:30:20.562690 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:20 crc kubenswrapper[4713]: I0314 05:30:20.562726 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:20 crc kubenswrapper[4713]: I0314 05:30:20.562704 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:20 crc kubenswrapper[4713]: E0314 05:30:20.562853 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:20 crc kubenswrapper[4713]: E0314 05:30:20.562996 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:20 crc kubenswrapper[4713]: E0314 05:30:20.563320 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:21 crc kubenswrapper[4713]: I0314 05:30:21.563281 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:21 crc kubenswrapper[4713]: E0314 05:30:21.564198 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:22 crc kubenswrapper[4713]: I0314 05:30:22.562661 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:22 crc kubenswrapper[4713]: I0314 05:30:22.562661 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:22 crc kubenswrapper[4713]: I0314 05:30:22.562673 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:22 crc kubenswrapper[4713]: E0314 05:30:22.563080 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:22 crc kubenswrapper[4713]: E0314 05:30:22.562831 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:22 crc kubenswrapper[4713]: E0314 05:30:22.563145 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:22 crc kubenswrapper[4713]: E0314 05:30:22.685250 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:30:23 crc kubenswrapper[4713]: I0314 05:30:23.562844 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:23 crc kubenswrapper[4713]: E0314 05:30:23.563117 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:24 crc kubenswrapper[4713]: I0314 05:30:24.562922 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:24 crc kubenswrapper[4713]: I0314 05:30:24.563002 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:24 crc kubenswrapper[4713]: I0314 05:30:24.563043 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:24 crc kubenswrapper[4713]: E0314 05:30:24.563163 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:24 crc kubenswrapper[4713]: E0314 05:30:24.563364 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:24 crc kubenswrapper[4713]: E0314 05:30:24.563542 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:25 crc kubenswrapper[4713]: I0314 05:30:25.563339 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:25 crc kubenswrapper[4713]: E0314 05:30:25.563575 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:26 crc kubenswrapper[4713]: I0314 05:30:26.563531 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:26 crc kubenswrapper[4713]: I0314 05:30:26.563602 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:26 crc kubenswrapper[4713]: I0314 05:30:26.563741 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:26 crc kubenswrapper[4713]: E0314 05:30:26.563901 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:26 crc kubenswrapper[4713]: E0314 05:30:26.564035 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:26 crc kubenswrapper[4713]: E0314 05:30:26.564178 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:27 crc kubenswrapper[4713]: I0314 05:30:27.562729 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:27 crc kubenswrapper[4713]: E0314 05:30:27.564296 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:27 crc kubenswrapper[4713]: E0314 05:30:27.685813 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:30:28 crc kubenswrapper[4713]: I0314 05:30:28.563170 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:28 crc kubenswrapper[4713]: I0314 05:30:28.563439 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:28 crc kubenswrapper[4713]: E0314 05:30:28.563450 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:28 crc kubenswrapper[4713]: I0314 05:30:28.563530 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:28 crc kubenswrapper[4713]: I0314 05:30:28.563874 4713 scope.go:117] "RemoveContainer" containerID="430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8" Mar 14 05:30:28 crc kubenswrapper[4713]: I0314 05:30:28.564426 4713 scope.go:117] "RemoveContainer" containerID="36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339" Mar 14 05:30:28 crc kubenswrapper[4713]: E0314 05:30:28.564586 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:28 crc kubenswrapper[4713]: E0314 05:30:28.564637 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:29 crc kubenswrapper[4713]: I0314 05:30:29.123672 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5l5jq_703b6542-1a83-442a-9673-6a774399dd7e/kube-multus/1.log" Mar 14 05:30:29 crc kubenswrapper[4713]: I0314 05:30:29.123771 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5l5jq" event={"ID":"703b6542-1a83-442a-9673-6a774399dd7e","Type":"ContainerStarted","Data":"585e486410f622ff2293f2ac4863100ffe11ca3f5279fb10a5283757080639be"} Mar 14 05:30:29 crc kubenswrapper[4713]: I0314 05:30:29.125635 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/3.log" Mar 14 05:30:29 crc kubenswrapper[4713]: I0314 05:30:29.131497 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerStarted","Data":"6727dd3495d1353710973a463cb317f3608479ca0b89ddd20fc9a6bb11606848"} Mar 14 05:30:29 crc kubenswrapper[4713]: I0314 05:30:29.132543 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:30:29 crc kubenswrapper[4713]: I0314 05:30:29.151595 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5l5jq" podStartSLOduration=143.151572212 podStartE2EDuration="2m23.151572212s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:29.151406297 +0000 UTC m=+212.239315607" watchObservedRunningTime="2026-03-14 05:30:29.151572212 +0000 UTC m=+212.239481512" Mar 14 05:30:29 crc kubenswrapper[4713]: I0314 
05:30:29.563528 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:29 crc kubenswrapper[4713]: E0314 05:30:29.563961 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:29 crc kubenswrapper[4713]: I0314 05:30:29.652252 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podStartSLOduration=143.652193377 podStartE2EDuration="2m23.652193377s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:29.198695811 +0000 UTC m=+212.286605121" watchObservedRunningTime="2026-03-14 05:30:29.652193377 +0000 UTC m=+212.740102697" Mar 14 05:30:29 crc kubenswrapper[4713]: I0314 05:30:29.653800 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2t6mv"] Mar 14 05:30:30 crc kubenswrapper[4713]: I0314 05:30:30.134513 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:30 crc kubenswrapper[4713]: E0314 05:30:30.134720 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:30 crc kubenswrapper[4713]: I0314 05:30:30.563499 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:30 crc kubenswrapper[4713]: I0314 05:30:30.563686 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:30 crc kubenswrapper[4713]: E0314 05:30:30.563739 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:30 crc kubenswrapper[4713]: I0314 05:30:30.563537 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:30 crc kubenswrapper[4713]: E0314 05:30:30.563931 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:30 crc kubenswrapper[4713]: E0314 05:30:30.564444 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:31 crc kubenswrapper[4713]: I0314 05:30:31.563373 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:31 crc kubenswrapper[4713]: E0314 05:30:31.563600 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2t6mv" podUID="8f2a1689-2973-4684-88d0-4ac7edb9b1d3" Mar 14 05:30:32 crc kubenswrapper[4713]: I0314 05:30:32.563519 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:32 crc kubenswrapper[4713]: I0314 05:30:32.563590 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:32 crc kubenswrapper[4713]: I0314 05:30:32.563654 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:32 crc kubenswrapper[4713]: E0314 05:30:32.563735 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 05:30:32 crc kubenswrapper[4713]: E0314 05:30:32.564163 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 05:30:32 crc kubenswrapper[4713]: E0314 05:30:32.564307 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 05:30:33 crc kubenswrapper[4713]: I0314 05:30:33.563324 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:30:33 crc kubenswrapper[4713]: I0314 05:30:33.566879 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 05:30:33 crc kubenswrapper[4713]: I0314 05:30:33.567083 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 05:30:34 crc kubenswrapper[4713]: I0314 05:30:34.563337 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:34 crc kubenswrapper[4713]: I0314 05:30:34.563406 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:34 crc kubenswrapper[4713]: I0314 05:30:34.563337 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:34 crc kubenswrapper[4713]: I0314 05:30:34.566046 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 05:30:34 crc kubenswrapper[4713]: I0314 05:30:34.566182 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 05:30:34 crc kubenswrapper[4713]: I0314 05:30:34.567289 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 05:30:34 crc kubenswrapper[4713]: I0314 05:30:34.567415 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.018600 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 14 05:30:38 crc 
kubenswrapper[4713]: I0314 05:30:38.089779 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ph5z9"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.090297 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-24csf"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.090355 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.091249 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84xqp"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.091540 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.092464 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.094495 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.094951 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.100453 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.100868 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.100953 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.101236 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.101995 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-748rb"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.102280 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.102548 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.102597 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.102633 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.102677 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 14 05:30:38 crc 
kubenswrapper[4713]: I0314 05:30:38.102727 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.102893 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.103035 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.103169 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.103130 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.103949 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.104198 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9rqn"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.104556 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.105014 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.104628 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.105097 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.104693 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.104769 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.104792 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.104814 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.104843 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.106030 4713 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.106329 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.106495 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.107170 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.107408 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.107495 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.108369 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.108631 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rp4kf"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.108902 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.109708 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.111628 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.111774 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.113194 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.113522 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.114589 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.115239 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.116240 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.117052 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.117333 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-whskd"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.118255 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-whskd" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.121251 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ztqtl"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.122047 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.122960 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.123640 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.127865 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vmsm7"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.128102 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.142700 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cdq27"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.144290 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.146076 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.166614 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.169776 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5vl8b"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.170177 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rd5mn"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.171793 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.171992 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.172276 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.173145 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.173227 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-592mg"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.173785 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.173956 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.173975 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174330 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174417 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174533 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174605 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174656 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174683 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174536 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174734 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174810 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174833 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174844 4713 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174813 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174966 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174991 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175059 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175256 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175395 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175450 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175492 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175574 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175689 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175797 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175829 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.175821 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.176018 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.176129 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.176149 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.174615 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.176300 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.176345 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.176410 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.176470 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.176755 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.177095 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.179166 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.179700 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.181170 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.181440 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.181604 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.181685 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.181761 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.181842 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.181915 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.182012 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.182090 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.182135 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.181459 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.181685 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.182425 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.182442 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.182553 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.183163 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.183368 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.183682 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.184149 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.184405 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.184416 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.185002 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.196255 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.196458 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.197346 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.197537 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.197757 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.198014 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.198516 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.198876 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.199047 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.199525 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.199722 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.200838 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.218218 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.220260 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.224320 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.224629 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-serving-cert\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.234984 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gl9v\" (UniqueName: \"kubernetes.io/projected/8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b-kube-api-access-4gl9v\") pod \"openshift-apiserver-operator-796bbdcf4f-24bhl\" (UID: \"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235084 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-image-import-ca\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235122 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgcld\" (UniqueName: \"kubernetes.io/projected/848993f1-7cd4-405c-8f05-74b0e0a79730-kube-api-access-xgcld\") pod \"dns-operator-744455d44c-ph5z9\" (UID: \"848993f1-7cd4-405c-8f05-74b0e0a79730\") " pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235160 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235193 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vtptb\" (UID: \"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235257 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-serving-cert\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235283 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-client-ca\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235306 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ca75b92-342d-46ef-8307-92efc0200a55-audit-dir\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235474 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235565 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235608 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-trusted-ca-bundle\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235638 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68986d18-3623-49a9-88c3-983c5acf2e09-etcd-client\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.235665 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-config\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236613 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4q9k\" (UniqueName: \"kubernetes.io/projected/fc9fcc69-c663-4474-b449-eee4c468cd4f-kube-api-access-s4q9k\") pod \"downloads-7954f5f757-whskd\" (UID: \"fc9fcc69-c663-4474-b449-eee4c468cd4f\") " pod="openshift-console/downloads-7954f5f757-whskd"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236674 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-service-ca\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236710 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68986d18-3623-49a9-88c3-983c5acf2e09-audit-policies\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236735 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-trusted-ca-bundle\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236763 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-24bhl\" (UID: \"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236791 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvb9j\" (UniqueName: \"kubernetes.io/projected/d87e3f74-fd38-4b24-b489-cf054f0e8375-kube-api-access-vvb9j\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236819 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-556hp\" (UID: \"ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236874 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-audit\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236898 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb6723a-f90c-46d8-a294-d9f916179353-serving-cert\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236927 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-etcd-client\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236953 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-config\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236978 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6491e577-c2d1-4c4b-b1d0-e82b34eec943-serving-cert\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237005 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d87e3f74-fd38-4b24-b489-cf054f0e8375-service-ca-bundle\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237043 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1401ff24-2230-4431-8b63-d4d0980daa0c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237069 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwwcs\" (UniqueName: \"kubernetes.io/projected/da4bb664-a24b-4644-9b7a-a0c6eda2c66f-kube-api-access-bwwcs\") pod \"openshift-config-operator-7777fb866f-748rb\" (UID: \"da4bb664-a24b-4644-9b7a-a0c6eda2c66f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237095 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d87e3f74-fd38-4b24-b489-cf054f0e8375-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237127 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237154 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-audit-policies\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237178 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237276 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d87e3f74-fd38-4b24-b489-cf054f0e8375-serving-cert\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237323 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237347 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1401ff24-2230-4431-8b63-d4d0980daa0c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237378 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237408 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-config\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237435 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vtptb\" (UID: \"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237461 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw476\" (UniqueName: \"kubernetes.io/projected/1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294-kube-api-access-sw476\") pod \"openshift-controller-manager-operator-756b6f6bc6-vtptb\" (UID: \"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237494 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/da4bb664-a24b-4644-9b7a-a0c6eda2c66f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-748rb\" (UID: \"da4bb664-a24b-4644-9b7a-a0c6eda2c66f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237518 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1af06ce0-0d0f-4113-b4e3-32d82664688b-auth-proxy-config\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237544 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gldf\" (UniqueName: \"kubernetes.io/projected/96bf650f-2c46-40aa-b26b-5d8a6df529fd-kube-api-access-2gldf\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237569 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1af06ce0-0d0f-4113-b4e3-32d82664688b-config\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237600 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-node-pullsecrets\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237622 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j5zj\" (UniqueName: \"kubernetes.io/projected/1af06ce0-0d0f-4113-b4e3-32d82664688b-kube-api-access-6j5zj\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237650 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-client-ca\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237674 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7kv\" (UniqueName: \"kubernetes.io/projected/6491e577-c2d1-4c4b-b1d0-e82b34eec943-kube-api-access-qf7kv\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237715 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-etcd-serving-ca\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237740 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-encryption-config\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237781 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99wp7\" (UniqueName: \"kubernetes.io/projected/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-kube-api-access-99wp7\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237818 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68986d18-3623-49a9-88c3-983c5acf2e09-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237860 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-oauth-serving-cert\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237891 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-config\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237912 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr4m6\" (UniqueName: \"kubernetes.io/projected/ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037-kube-api-access-lr4m6\") pod \"cluster-samples-operator-665b6dd947-556hp\" (UID: \"ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237934 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237968 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvnjn\" (UniqueName: \"kubernetes.io/projected/68986d18-3623-49a9-88c3-983c5acf2e09-kube-api-access-qvnjn\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.237996 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqc9z\" (UniqueName: \"kubernetes.io/projected/352eafb9-871b-4540-9083-d5a6c0340453-kube-api-access-gqc9z\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238021 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-config\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238067 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h88xq\" (UniqueName: \"kubernetes.io/projected/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-kube-api-access-h88xq\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238095 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238116 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68986d18-3623-49a9-88c3-983c5acf2e09-audit-dir\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238139 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da4bb664-a24b-4644-9b7a-a0c6eda2c66f-serving-cert\") pod \"openshift-config-operator-7777fb866f-748rb\" (UID: \"da4bb664-a24b-4644-9b7a-a0c6eda2c66f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238172 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1af06ce0-0d0f-4113-b4e3-32d82664688b-machine-approver-tls\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238194 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44gjz\" (UniqueName: \"kubernetes.io/projected/7ca75b92-342d-46ef-8307-92efc0200a55-kube-api-access-44gjz\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238231 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238253 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-24bhl\" (UID: \"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238276 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-images\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238299 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68986d18-3623-49a9-88c3-983c5acf2e09-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238319 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-oauth-config\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238345 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc
kubenswrapper[4713]: I0314 05:30:38.238368 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb6723a-f90c-46d8-a294-d9f916179353-config\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238393 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238414 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1401ff24-2230-4431-8b63-d4d0980daa0c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238439 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/848993f1-7cd4-405c-8f05-74b0e0a79730-metrics-tls\") pod \"dns-operator-744455d44c-ph5z9\" (UID: \"848993f1-7cd4-405c-8f05-74b0e0a79730\") " pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238459 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d87e3f74-fd38-4b24-b489-cf054f0e8375-config\") pod 
\"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238481 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238501 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gpd9\" (UniqueName: \"kubernetes.io/projected/1401ff24-2230-4431-8b63-d4d0980daa0c-kube-api-access-5gpd9\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238523 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcz27\" (UniqueName: \"kubernetes.io/projected/8fb6723a-f90c-46d8-a294-d9f916179353-kube-api-access-vcz27\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238548 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-audit-dir\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc 
kubenswrapper[4713]: I0314 05:30:38.238569 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68986d18-3623-49a9-88c3-983c5acf2e09-serving-cert\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238589 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68986d18-3623-49a9-88c3-983c5acf2e09-encryption-config\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238614 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352eafb9-871b-4540-9083-d5a6c0340453-serving-cert\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.238635 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb6723a-f90c-46d8-a294-d9f916179353-trusted-ca\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236386 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.236546 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.239571 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.240843 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.241528 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.242064 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.252018 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.254318 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.255079 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.255142 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-s276w"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.255967 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.261003 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.261411 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.265169 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.265946 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.266674 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.266949 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.267263 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mw4tj"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.267714 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.267865 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-l4d7n"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.268839 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.270372 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.278997 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ph5z9"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.279135 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.279706 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.281171 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.281767 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.296577 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.305459 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.306218 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2qpzw"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.306686 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.306950 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.307095 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.309464 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.310050 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nwnrf"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.314688 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.316614 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.317460 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557770-dmq2m"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.318077 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.318704 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.319601 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.320591 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.329226 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.329421 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.329910 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-24csf"] Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.337610 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342194 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1401ff24-2230-4431-8b63-d4d0980daa0c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342278 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwwcs\" (UniqueName: \"kubernetes.io/projected/da4bb664-a24b-4644-9b7a-a0c6eda2c66f-kube-api-access-bwwcs\") pod \"openshift-config-operator-7777fb866f-748rb\" (UID: \"da4bb664-a24b-4644-9b7a-a0c6eda2c66f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 05:30:38 crc 
kubenswrapper[4713]: I0314 05:30:38.342322 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d87e3f74-fd38-4b24-b489-cf054f0e8375-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342348 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342371 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-audit-policies\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342417 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d87e3f74-fd38-4b24-b489-cf054f0e8375-serving-cert\") pod \"authentication-operator-69f744f599-ztqtl\" 
(UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342447 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342470 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342494 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1401ff24-2230-4431-8b63-d4d0980daa0c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342520 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-config\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342542 4713 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vtptb\" (UID: \"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342567 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw476\" (UniqueName: \"kubernetes.io/projected/1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294-kube-api-access-sw476\") pod \"openshift-controller-manager-operator-756b6f6bc6-vtptb\" (UID: \"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342592 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1af06ce0-0d0f-4113-b4e3-32d82664688b-auth-proxy-config\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342619 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/da4bb664-a24b-4644-9b7a-a0c6eda2c66f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-748rb\" (UID: \"da4bb664-a24b-4644-9b7a-a0c6eda2c66f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342650 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gldf\" (UniqueName: \"kubernetes.io/projected/96bf650f-2c46-40aa-b26b-5d8a6df529fd-kube-api-access-2gldf\") pod 
\"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342672 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1af06ce0-0d0f-4113-b4e3-32d82664688b-config\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342700 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-node-pullsecrets\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342732 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j5zj\" (UniqueName: \"kubernetes.io/projected/1af06ce0-0d0f-4113-b4e3-32d82664688b-kube-api-access-6j5zj\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342759 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-etcd-serving-ca\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342785 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-encryption-config\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342813 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-client-ca\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342839 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7kv\" (UniqueName: \"kubernetes.io/projected/6491e577-c2d1-4c4b-b1d0-e82b34eec943-kube-api-access-qf7kv\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342874 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04c75654-c34a-4a93-8be1-ab766574a3ed-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-crb7p\" (UID: \"04c75654-c34a-4a93-8be1-ab766574a3ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.342933 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99wp7\" (UniqueName: \"kubernetes.io/projected/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-kube-api-access-99wp7\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" Mar 14 05:30:38 crc 
kubenswrapper[4713]: I0314 05:30:38.342978 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68986d18-3623-49a9-88c3-983c5acf2e09-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343005 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-oauth-serving-cert\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343032 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-config\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343056 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr4m6\" (UniqueName: \"kubernetes.io/projected/ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037-kube-api-access-lr4m6\") pod \"cluster-samples-operator-665b6dd947-556hp\" (UID: \"ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343079 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: 
\"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343103 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvnjn\" (UniqueName: \"kubernetes.io/projected/68986d18-3623-49a9-88c3-983c5acf2e09-kube-api-access-qvnjn\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-config\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343151 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqc9z\" (UniqueName: \"kubernetes.io/projected/352eafb9-871b-4540-9083-d5a6c0340453-kube-api-access-gqc9z\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343178 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h88xq\" (UniqueName: \"kubernetes.io/projected/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-kube-api-access-h88xq\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343223 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343249 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68986d18-3623-49a9-88c3-983c5acf2e09-audit-dir\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343274 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da4bb664-a24b-4644-9b7a-a0c6eda2c66f-serving-cert\") pod \"openshift-config-operator-7777fb866f-748rb\" (UID: \"da4bb664-a24b-4644-9b7a-a0c6eda2c66f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343306 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1af06ce0-0d0f-4113-b4e3-32d82664688b-machine-approver-tls\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343330 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-images\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343352 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-44gjz\" (UniqueName: \"kubernetes.io/projected/7ca75b92-342d-46ef-8307-92efc0200a55-kube-api-access-44gjz\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343375 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-24bhl\" (UID: \"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343414 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68986d18-3623-49a9-88c3-983c5acf2e09-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343430 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-oauth-config\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:38 
crc kubenswrapper[4713]: I0314 05:30:38.343451 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343471 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343488 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb6723a-f90c-46d8-a294-d9f916179353-config\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343508 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1401ff24-2230-4431-8b63-d4d0980daa0c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343525 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d87e3f74-fd38-4b24-b489-cf054f0e8375-config\") pod \"authentication-operator-69f744f599-ztqtl\" 
(UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343544 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/848993f1-7cd4-405c-8f05-74b0e0a79730-metrics-tls\") pod \"dns-operator-744455d44c-ph5z9\" (UID: \"848993f1-7cd4-405c-8f05-74b0e0a79730\") " pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343560 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-audit-dir\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343575 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343592 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gpd9\" (UniqueName: \"kubernetes.io/projected/1401ff24-2230-4431-8b63-d4d0980daa0c-kube-api-access-5gpd9\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343611 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcz27\" (UniqueName: 
\"kubernetes.io/projected/8fb6723a-f90c-46d8-a294-d9f916179353-kube-api-access-vcz27\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343630 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04c75654-c34a-4a93-8be1-ab766574a3ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-crb7p\" (UID: \"04c75654-c34a-4a93-8be1-ab766574a3ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343648 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68986d18-3623-49a9-88c3-983c5acf2e09-serving-cert\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343664 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68986d18-3623-49a9-88c3-983c5acf2e09-encryption-config\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352eafb9-871b-4540-9083-d5a6c0340453-serving-cert\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:38 crc kubenswrapper[4713]: 
I0314 05:30:38.343696 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb6723a-f90c-46d8-a294-d9f916179353-trusted-ca\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343716 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgcld\" (UniqueName: \"kubernetes.io/projected/848993f1-7cd4-405c-8f05-74b0e0a79730-kube-api-access-xgcld\") pod \"dns-operator-744455d44c-ph5z9\" (UID: \"848993f1-7cd4-405c-8f05-74b0e0a79730\") " pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343735 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343752 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-serving-cert\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343770 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gl9v\" (UniqueName: \"kubernetes.io/projected/8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b-kube-api-access-4gl9v\") pod \"openshift-apiserver-operator-796bbdcf4f-24bhl\" (UID: 
\"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343785 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-image-import-ca\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343803 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-serving-cert\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343817 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-client-ca\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343832 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vtptb\" (UID: \"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343850 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343865 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343883 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04c75654-c34a-4a93-8be1-ab766574a3ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-crb7p\" (UID: \"04c75654-c34a-4a93-8be1-ab766574a3ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343900 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ca75b92-342d-46ef-8307-92efc0200a55-audit-dir\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343918 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-trusted-ca-bundle\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc 
kubenswrapper[4713]: I0314 05:30:38.343936 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68986d18-3623-49a9-88c3-983c5acf2e09-etcd-client\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343953 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-config\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343972 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4q9k\" (UniqueName: \"kubernetes.io/projected/fc9fcc69-c663-4474-b449-eee4c468cd4f-kube-api-access-s4q9k\") pod \"downloads-7954f5f757-whskd\" (UID: \"fc9fcc69-c663-4474-b449-eee4c468cd4f\") " pod="openshift-console/downloads-7954f5f757-whskd" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.343990 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-service-ca\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344006 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-556hp\" (UID: \"ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344020 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68986d18-3623-49a9-88c3-983c5acf2e09-audit-policies\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344034 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-trusted-ca-bundle\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344052 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-24bhl\" (UID: \"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344067 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvb9j\" (UniqueName: \"kubernetes.io/projected/d87e3f74-fd38-4b24-b489-cf054f0e8375-kube-api-access-vvb9j\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344094 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-audit\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344108 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb6723a-f90c-46d8-a294-d9f916179353-serving-cert\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344150 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-etcd-client\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344165 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-config\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344180 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6491e577-c2d1-4c4b-b1d0-e82b34eec943-serving-cert\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344196 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d87e3f74-fd38-4b24-b489-cf054f0e8375-service-ca-bundle\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.344888 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d87e3f74-fd38-4b24-b489-cf054f0e8375-service-ca-bundle\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.346804 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d87e3f74-fd38-4b24-b489-cf054f0e8375-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.347856 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d87e3f74-fd38-4b24-b489-cf054f0e8375-config\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.348731 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-images\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 
05:30:38.349877 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.349966 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-config\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.350597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-24bhl\" (UID: \"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.350625 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-audit-policies\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.351081 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68986d18-3623-49a9-88c3-983c5acf2e09-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:38 crc 
kubenswrapper[4713]: I0314 05:30:38.351372 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d87e3f74-fd38-4b24-b489-cf054f0e8375-serving-cert\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.352185 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.352717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.353648 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1af06ce0-0d0f-4113-b4e3-32d82664688b-auth-proxy-config\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.353754 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.354048 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/848993f1-7cd4-405c-8f05-74b0e0a79730-metrics-tls\") pod \"dns-operator-744455d44c-ph5z9\" (UID: \"848993f1-7cd4-405c-8f05-74b0e0a79730\") " pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.354095 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-audit-dir\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.354190 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/da4bb664-a24b-4644-9b7a-a0c6eda2c66f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-748rb\" (UID: \"da4bb664-a24b-4644-9b7a-a0c6eda2c66f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.354375 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1401ff24-2230-4431-8b63-d4d0980daa0c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.354574 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.355626 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1af06ce0-0d0f-4113-b4e3-32d82664688b-config\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.355729 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-node-pullsecrets\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.356171 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.356279 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68986d18-3623-49a9-88c3-983c5acf2e09-audit-dir\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.356774 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-etcd-serving-ca\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.356837 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vtptb\" (UID: \"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.357094 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.357296 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb6723a-f90c-46d8-a294-d9f916179353-config\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.357355 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ca75b92-342d-46ef-8307-92efc0200a55-audit-dir\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.377747 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.378447 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1af06ce0-0d0f-4113-b4e3-32d82664688b-machine-approver-tls\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.378816 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.379426 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-encryption-config\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.379708 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68986d18-3623-49a9-88c3-983c5acf2e09-encryption-config\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.380053 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68986d18-3623-49a9-88c3-983c5acf2e09-serving-cert\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.380271 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1401ff24-2230-4431-8b63-d4d0980daa0c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.381047 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-oauth-config\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.381281 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.381406 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-config\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.381444 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.381511 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.382277 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-image-import-ca\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.383091 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da4bb664-a24b-4644-9b7a-a0c6eda2c66f-serving-cert\") pod \"openshift-config-operator-7777fb866f-748rb\" (UID: \"da4bb664-a24b-4644-9b7a-a0c6eda2c66f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.388350 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-client-ca\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.389416 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-oauth-serving-cert\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.389425 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb6723a-f90c-46d8-a294-d9f916179353-trusted-ca\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.389945 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-config\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.390558 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vtptb\" (UID: \"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.390973 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68986d18-3623-49a9-88c3-983c5acf2e09-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.393488 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-audit\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.395023 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-trusted-ca-bundle\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.395604 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb6723a-f90c-46d8-a294-d9f916179353-serving-cert\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.395873 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68986d18-3623-49a9-88c3-983c5acf2e09-audit-policies\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.396056 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-trusted-ca-bundle\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.396518 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-serving-cert\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.396982 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-service-ca\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.398104 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.398688 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6491e577-c2d1-4c4b-b1d0-e82b34eec943-serving-cert\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.398767 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-client-ca\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.399423 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68986d18-3623-49a9-88c3-983c5acf2e09-etcd-client\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.400718 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-556hp\" (UID: \"ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.400852 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84xqp"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.400896 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-748rb"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.400916 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-config\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.401549 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.401867 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-etcd-client\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.402163 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-config\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.402602 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352eafb9-871b-4540-9083-d5a6c0340453-serving-cert\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.403319 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.405794 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.412045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-serving-cert\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.416245 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-24bhl\" (UID: \"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.416367 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ztqtl"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.412045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.418532 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9rqn"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.419720 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.420041 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.420349 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.421754 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rp4kf"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.422484 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-whskd"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.423614 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nwnrf"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.424941 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vmsm7"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.426189 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.427382 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.428759 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cdq27"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.430576 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.431920 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.432189 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.433417 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.434550 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.436771 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rd5mn"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.437670 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p46s4"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.438769 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-54p9f"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.438811 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p46s4"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.439876 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-54p9f"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.440328 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.441188 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.442316 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.443689 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5vl8b"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.444642 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.444977 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04c75654-c34a-4a93-8be1-ab766574a3ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-crb7p\" (UID: \"04c75654-c34a-4a93-8be1-ab766574a3ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.445089 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04c75654-c34a-4a93-8be1-ab766574a3ed-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-crb7p\" (UID: \"04c75654-c34a-4a93-8be1-ab766574a3ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.445249 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04c75654-c34a-4a93-8be1-ab766574a3ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-crb7p\" (UID: \"04c75654-c34a-4a93-8be1-ab766574a3ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.445685 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.446701 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p46s4"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.447875 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.448862 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.449861 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557770-dmq2m"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.450940 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mw4tj"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.451977 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.452969 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.453022 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2qpzw"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.454113 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.455136 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-592mg"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.456240 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-l4d7n"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.457258 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-54p9f"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.458294 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5f26d"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.458987 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5f26d"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.459376 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf"]
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.472776 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.492875 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.512427 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.537496 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.552044 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.572053 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.593380 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.612914 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.633841 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.652478 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.673504 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.693169 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.732762 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.752514 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.772656 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.792271 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.812195 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.833583 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.840786 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04c75654-c34a-4a93-8be1-ab766574a3ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-crb7p\" (UID: \"04c75654-c34a-4a93-8be1-ab766574a3ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.853043 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.874433 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.893367 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.896086 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04c75654-c34a-4a93-8be1-ab766574a3ed-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-crb7p\" (UID: \"04c75654-c34a-4a93-8be1-ab766574a3ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.913699 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.932854 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.952839 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.972692 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 14 05:30:38 crc kubenswrapper[4713]: I0314 05:30:38.994602 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.014549 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.034895 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.054184 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.074324 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.093866 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.116070 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.134891 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.154597 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.174453 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.195532 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.214377 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.233694 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.254526 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.271430 4713 request.go:700] Waited for 1.004262654s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-controller-manager-operator-config&limit=500&resourceVersion=0
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.273676 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.293793 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.313267 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.333440 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.355392 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.374148 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.401361 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.414497 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.433466 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.453906 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.494611 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.513033 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.534000 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.552892 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 14 05:30:39 crc kubenswrapper[4713]:
I0314 05:30:39.573399 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.594065 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.614666 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.633845 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.653601 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.673673 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.694402 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.713890 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.733991 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.753867 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.773676 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 
14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.793229 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.814503 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.833920 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.853913 4713 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.872737 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.894391 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.913312 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.933316 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.953558 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 14 05:30:39 crc kubenswrapper[4713]: I0314 05:30:39.973879 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.016679 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bwwcs\" (UniqueName: \"kubernetes.io/projected/da4bb664-a24b-4644-9b7a-a0c6eda2c66f-kube-api-access-bwwcs\") pod \"openshift-config-operator-7777fb866f-748rb\" (UID: \"da4bb664-a24b-4644-9b7a-a0c6eda2c66f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.030984 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44gjz\" (UniqueName: \"kubernetes.io/projected/7ca75b92-342d-46ef-8307-92efc0200a55-kube-api-access-44gjz\") pod \"oauth-openshift-558db77b4-v9rqn\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.046858 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.056120 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvnjn\" (UniqueName: \"kubernetes.io/projected/68986d18-3623-49a9-88c3-983c5acf2e09-kube-api-access-qvnjn\") pod \"apiserver-7bbb656c7d-vlkm6\" (UID: \"68986d18-3623-49a9-88c3-983c5acf2e09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.076444 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqc9z\" (UniqueName: \"kubernetes.io/projected/352eafb9-871b-4540-9083-d5a6c0340453-kube-api-access-gqc9z\") pod \"route-controller-manager-6576b87f9c-8p2ln\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.095076 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h88xq\" (UniqueName: 
\"kubernetes.io/projected/d6d1d72a-09c8-47bc-ad1a-ad11843c9a30-kube-api-access-h88xq\") pod \"apiserver-76f77b778f-24csf\" (UID: \"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30\") " pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.118174 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw476\" (UniqueName: \"kubernetes.io/projected/1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294-kube-api-access-sw476\") pod \"openshift-controller-manager-operator-756b6f6bc6-vtptb\" (UID: \"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.134730 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gldf\" (UniqueName: \"kubernetes.io/projected/96bf650f-2c46-40aa-b26b-5d8a6df529fd-kube-api-access-2gldf\") pod \"console-f9d7485db-rp4kf\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.162572 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gpd9\" (UniqueName: \"kubernetes.io/projected/1401ff24-2230-4431-8b63-d4d0980daa0c-kube-api-access-5gpd9\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.171725 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcz27\" (UniqueName: \"kubernetes.io/projected/8fb6723a-f90c-46d8-a294-d9f916179353-kube-api-access-vcz27\") pod \"console-operator-58897d9998-cdq27\" (UID: \"8fb6723a-f90c-46d8-a294-d9f916179353\") " pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:40 crc 
kubenswrapper[4713]: I0314 05:30:40.282139 4713 request.go:700] Waited for 1.926222486s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.282497 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.283977 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.284147 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.303985 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.304168 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4q9k\" (UniqueName: \"kubernetes.io/projected/fc9fcc69-c663-4474-b449-eee4c468cd4f-kube-api-access-s4q9k\") pod \"downloads-7954f5f757-whskd\" (UID: \"fc9fcc69-c663-4474-b449-eee4c468cd4f\") " pod="openshift-console/downloads-7954f5f757-whskd" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.307623 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j5zj\" (UniqueName: \"kubernetes.io/projected/1af06ce0-0d0f-4113-b4e3-32d82664688b-kube-api-access-6j5zj\") pod \"machine-approver-56656f9798-cm8rk\" (UID: \"1af06ce0-0d0f-4113-b4e3-32d82664688b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.308458 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgcld\" (UniqueName: \"kubernetes.io/projected/848993f1-7cd4-405c-8f05-74b0e0a79730-kube-api-access-xgcld\") pod \"dns-operator-744455d44c-ph5z9\" (UID: \"848993f1-7cd4-405c-8f05-74b0e0a79730\") " pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.310874 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1401ff24-2230-4431-8b63-d4d0980daa0c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5df8p\" (UID: \"1401ff24-2230-4431-8b63-d4d0980daa0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.323422 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.340691 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf7kv\" (UniqueName: \"kubernetes.io/projected/6491e577-c2d1-4c4b-b1d0-e82b34eec943-kube-api-access-qf7kv\") pod \"controller-manager-879f6c89f-84xqp\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.342647 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr4m6\" (UniqueName: \"kubernetes.io/projected/ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037-kube-api-access-lr4m6\") pod \"cluster-samples-operator-665b6dd947-556hp\" (UID: \"ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.343326 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvb9j\" (UniqueName: \"kubernetes.io/projected/d87e3f74-fd38-4b24-b489-cf054f0e8375-kube-api-access-vvb9j\") pod \"authentication-operator-69f744f599-ztqtl\" (UID: \"d87e3f74-fd38-4b24-b489-cf054f0e8375\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.348696 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99wp7\" (UniqueName: \"kubernetes.io/projected/80e7a49a-8aa3-41d2-b1bf-74a0689fbee6-kube-api-access-99wp7\") pod \"machine-api-operator-5694c8668f-vmsm7\" (UID: \"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.353873 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 14 
05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.363109 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gl9v\" (UniqueName: \"kubernetes.io/projected/8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b-kube-api-access-4gl9v\") pod \"openshift-apiserver-operator-796bbdcf4f-24bhl\" (UID: \"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.377813 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.379841 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.384673 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.394084 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.409393 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.414465 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.436870 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.437323 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.455811 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.475488 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.484692 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-whskd" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.511633 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04c75654-c34a-4a93-8be1-ab766574a3ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-crb7p\" (UID: \"04c75654-c34a-4a93-8be1-ab766574a3ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.513046 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.519745 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.521437 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.526082 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.535916 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.536058 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.544816 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.554986 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.585507 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.599562 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.692621 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"] Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.696833 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-trusted-ca\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.696875 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ce3a08b-0d84-46f7-aee4-c633105b323b-proxy-tls\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.696893 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mw4tj\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") " pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.696912 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5eb31828-9e51-4c7d-bc14-8787b4eca812-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-swm74\" (UID: 
\"5eb31828-9e51-4c7d-bc14-8787b4eca812\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.696950 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq5bx\" (UniqueName: \"kubernetes.io/projected/5b056811-6e63-410a-b961-29b5fe78025d-kube-api-access-cq5bx\") pod \"machine-config-controller-84d6567774-hvgrc\" (UID: \"5b056811-6e63-410a-b961-29b5fe78025d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.696967 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mw4tj\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") " pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.696983 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71586d21-5d3e-4ea9-840e-989af77915e8-config\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697241 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b056811-6e63-410a-b961-29b5fe78025d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hvgrc\" (UID: \"5b056811-6e63-410a-b961-29b5fe78025d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 
05:30:40.697280 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ce3a08b-0d84-46f7-aee4-c633105b323b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697297 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/71586d21-5d3e-4ea9-840e-989af77915e8-etcd-client\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697350 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvxk\" (UniqueName: \"kubernetes.io/projected/8c99344f-ed48-4193-9c8f-46c8f295ee0c-kube-api-access-2tvxk\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697412 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-bound-sa-token\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697433 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eb31828-9e51-4c7d-bc14-8787b4eca812-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-swm74\" (UID: \"5eb31828-9e51-4c7d-bc14-8787b4eca812\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697467 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697484 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slc5q\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-kube-api-access-slc5q\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697499 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdnw\" (UniqueName: \"kubernetes.io/projected/1ce3a08b-0d84-46f7-aee4-c633105b323b-kube-api-access-9zdnw\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697516 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c631a2c-40ce-4b64-a211-4305d3ea15bb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8txgf\" (UID: \"7c631a2c-40ce-4b64-a211-4305d3ea15bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697533 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/347baa98-b887-46c5-b2e9-bb6809206b42-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vfvfc\" (UID: \"347baa98-b887-46c5-b2e9-bb6809206b42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697550 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c99344f-ed48-4193-9c8f-46c8f295ee0c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697576 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9gcv\" (UniqueName: \"kubernetes.io/projected/068aebba-22ff-46cd-856c-e85d409e0ae5-kube-api-access-z9gcv\") pod \"marketplace-operator-79b997595-mw4tj\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") " pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697606 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ce3a08b-0d84-46f7-aee4-c633105b323b-images\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697622 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/71586d21-5d3e-4ea9-840e-989af77915e8-etcd-ca\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697638 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c631a2c-40ce-4b64-a211-4305d3ea15bb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8txgf\" (UID: \"7c631a2c-40ce-4b64-a211-4305d3ea15bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/71586d21-5d3e-4ea9-840e-989af77915e8-etcd-service-ca\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697686 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d626c9fa-84ff-40c0-ae90-c477a699591a-default-certificate\") pod 
\"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697722 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ed4a1500-6481-4d26-a107-f76299623688-profile-collector-cert\") pod \"catalog-operator-68c6474976-qkdqn\" (UID: \"ed4a1500-6481-4d26-a107-f76299623688\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.697748 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/63d1129b-c978-4cb1-a73f-e09786113590-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v9q67\" (UID: \"63d1129b-c978-4cb1-a73f-e09786113590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698598 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/347baa98-b887-46c5-b2e9-bb6809206b42-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vfvfc\" (UID: \"347baa98-b887-46c5-b2e9-bb6809206b42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698629 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b056811-6e63-410a-b961-29b5fe78025d-proxy-tls\") pod \"machine-config-controller-84d6567774-hvgrc\" (UID: \"5b056811-6e63-410a-b961-29b5fe78025d\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698663 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71586d21-5d3e-4ea9-840e-989af77915e8-serving-cert\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698716 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-tls\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698732 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698757 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qf54\" (UniqueName: \"kubernetes.io/projected/b0c5d3d7-8d33-4eba-a572-3c702a05a6df-kube-api-access-7qf54\") pod \"migrator-59844c95c7-qddwj\" (UID: \"b0c5d3d7-8d33-4eba-a572-3c702a05a6df\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698780 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5eb31828-9e51-4c7d-bc14-8787b4eca812-config\") pod \"kube-controller-manager-operator-78b949d7b-swm74\" (UID: \"5eb31828-9e51-4c7d-bc14-8787b4eca812\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698807 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65j5\" (UniqueName: \"kubernetes.io/projected/d626c9fa-84ff-40c0-ae90-c477a699591a-kube-api-access-w65j5\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698858 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebbfd4b5-734f-4e37-89af-c0f4f0904d94-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-l4d7n\" (UID: \"ebbfd4b5-734f-4e37-89af-c0f4f0904d94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698925 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8c99344f-ed48-4193-9c8f-46c8f295ee0c-metrics-tls\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c99344f-ed48-4193-9c8f-46c8f295ee0c-trusted-ca\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.698971 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/347baa98-b887-46c5-b2e9-bb6809206b42-config\") pod \"kube-apiserver-operator-766d6c64bb-vfvfc\" (UID: \"347baa98-b887-46c5-b2e9-bb6809206b42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699004 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-certificates\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699025 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d626c9fa-84ff-40c0-ae90-c477a699591a-metrics-certs\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699093 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ed4a1500-6481-4d26-a107-f76299623688-srv-cert\") pod \"catalog-operator-68c6474976-qkdqn\" (UID: \"ed4a1500-6481-4d26-a107-f76299623688\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699115 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkr25\" (UniqueName: 
\"kubernetes.io/projected/71586d21-5d3e-4ea9-840e-989af77915e8-kube-api-access-jkr25\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699169 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d626c9fa-84ff-40c0-ae90-c477a699591a-service-ca-bundle\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699190 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d626c9fa-84ff-40c0-ae90-c477a699591a-stats-auth\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699227 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7gnr\" (UniqueName: \"kubernetes.io/projected/ebbfd4b5-734f-4e37-89af-c0f4f0904d94-kube-api-access-h7gnr\") pod \"multus-admission-controller-857f4d67dd-l4d7n\" (UID: \"ebbfd4b5-734f-4e37-89af-c0f4f0904d94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699267 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r85f9\" (UniqueName: \"kubernetes.io/projected/7c631a2c-40ce-4b64-a211-4305d3ea15bb-kube-api-access-r85f9\") pod \"kube-storage-version-migrator-operator-b67b599dd-8txgf\" (UID: \"7c631a2c-40ce-4b64-a211-4305d3ea15bb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699283 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5hzl\" (UniqueName: \"kubernetes.io/projected/ed4a1500-6481-4d26-a107-f76299623688-kube-api-access-d5hzl\") pod \"catalog-operator-68c6474976-qkdqn\" (UID: \"ed4a1500-6481-4d26-a107-f76299623688\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.699324 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f29wv\" (UniqueName: \"kubernetes.io/projected/63d1129b-c978-4cb1-a73f-e09786113590-kube-api-access-f29wv\") pod \"control-plane-machine-set-operator-78cbb6b69f-v9q67\" (UID: \"63d1129b-c978-4cb1-a73f-e09786113590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" Mar 14 05:30:40 crc kubenswrapper[4713]: E0314 05:30:40.700117 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.200103283 +0000 UTC m=+224.288012583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.715040 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-24csf"] Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.735450 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.735513 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.749511 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9rqn"] Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.800290 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:40 crc kubenswrapper[4713]: E0314 
05:30:40.800479 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.300448822 +0000 UTC m=+224.388358122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801022 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebbfd4b5-734f-4e37-89af-c0f4f0904d94-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-l4d7n\" (UID: \"ebbfd4b5-734f-4e37-89af-c0f4f0904d94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801060 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-registration-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801089 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzcnf\" (UniqueName: \"kubernetes.io/projected/5e96825d-646e-44d7-b980-c7efcd08d2f3-kube-api-access-mzcnf\") pod \"machine-config-server-5f26d\" (UID: 
\"5e96825d-646e-44d7-b980-c7efcd08d2f3\") " pod="openshift-machine-config-operator/machine-config-server-5f26d" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801117 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xms2d\" (UniqueName: \"kubernetes.io/projected/0dbd3bea-9644-4bc5-96c7-822b26810706-kube-api-access-xms2d\") pod \"package-server-manager-789f6589d5-w65zj\" (UID: \"0dbd3bea-9644-4bc5-96c7-822b26810706\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801142 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96a65c75-d6b8-41db-8e74-263b186c7596-config-volume\") pod \"dns-default-p46s4\" (UID: \"96a65c75-d6b8-41db-8e74-263b186c7596\") " pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801184 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swwcn\" (UniqueName: \"kubernetes.io/projected/62a03b8e-3b89-41c5-9399-a6ae0d44a53c-kube-api-access-swwcn\") pod \"auto-csr-approver-29557770-dmq2m\" (UID: \"62a03b8e-3b89-41c5-9399-a6ae0d44a53c\") " pod="openshift-infra/auto-csr-approver-29557770-dmq2m" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801230 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8c99344f-ed48-4193-9c8f-46c8f295ee0c-metrics-tls\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801251 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8c99344f-ed48-4193-9c8f-46c8f295ee0c-trusted-ca\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801304 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/347baa98-b887-46c5-b2e9-bb6809206b42-config\") pod \"kube-apiserver-operator-766d6c64bb-vfvfc\" (UID: \"347baa98-b887-46c5-b2e9-bb6809206b42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801329 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1866f7b-9063-44ab-86c3-5f0f38d9a2e5-cert\") pod \"ingress-canary-54p9f\" (UID: \"a1866f7b-9063-44ab-86c3-5f0f38d9a2e5\") " pod="openshift-ingress-canary/ingress-canary-54p9f" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6zlm\" (UniqueName: \"kubernetes.io/projected/9d6db3a7-58c6-44ad-8bed-daf1086729ad-kube-api-access-m6zlm\") pod \"service-ca-operator-777779d784-9lrmn\" (UID: \"9d6db3a7-58c6-44ad-8bed-daf1086729ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801379 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-certificates\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801404 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d626c9fa-84ff-40c0-ae90-c477a699591a-metrics-certs\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801470 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ed4a1500-6481-4d26-a107-f76299623688-srv-cert\") pod \"catalog-operator-68c6474976-qkdqn\" (UID: \"ed4a1500-6481-4d26-a107-f76299623688\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801508 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkr25\" (UniqueName: \"kubernetes.io/projected/71586d21-5d3e-4ea9-840e-989af77915e8-kube-api-access-jkr25\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d626c9fa-84ff-40c0-ae90-c477a699591a-service-ca-bundle\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801586 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d626c9fa-84ff-40c0-ae90-c477a699591a-stats-auth\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc 
kubenswrapper[4713]: I0314 05:30:40.801609 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7gnr\" (UniqueName: \"kubernetes.io/projected/ebbfd4b5-734f-4e37-89af-c0f4f0904d94-kube-api-access-h7gnr\") pod \"multus-admission-controller-857f4d67dd-l4d7n\" (UID: \"ebbfd4b5-734f-4e37-89af-c0f4f0904d94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801633 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2608a13-9ee1-45ed-926b-329192ef4d34-secret-volume\") pod \"collect-profiles-29557770-vln29\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3fb76932-a1a4-4af2-bb2e-3f9b0e872e51-signing-key\") pod \"service-ca-9c57cc56f-2qpzw\" (UID: \"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51\") " pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r85f9\" (UniqueName: \"kubernetes.io/projected/7c631a2c-40ce-4b64-a211-4305d3ea15bb-kube-api-access-r85f9\") pod \"kube-storage-version-migrator-operator-b67b599dd-8txgf\" (UID: \"7c631a2c-40ce-4b64-a211-4305d3ea15bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801703 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5hzl\" (UniqueName: 
\"kubernetes.io/projected/ed4a1500-6481-4d26-a107-f76299623688-kube-api-access-d5hzl\") pod \"catalog-operator-68c6474976-qkdqn\" (UID: \"ed4a1500-6481-4d26-a107-f76299623688\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801732 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b00783a2-42c7-45b5-b83d-136c314b0086-apiservice-cert\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801769 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f29wv\" (UniqueName: \"kubernetes.io/projected/63d1129b-c978-4cb1-a73f-e09786113590-kube-api-access-f29wv\") pod \"control-plane-machine-set-operator-78cbb6b69f-v9q67\" (UID: \"63d1129b-c978-4cb1-a73f-e09786113590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801790 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96a65c75-d6b8-41db-8e74-263b186c7596-metrics-tls\") pod \"dns-default-p46s4\" (UID: \"96a65c75-d6b8-41db-8e74-263b186c7596\") " pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801852 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-trusted-ca\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801887 
4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw8tq\" (UniqueName: \"kubernetes.io/projected/b00783a2-42c7-45b5-b83d-136c314b0086-kube-api-access-qw8tq\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801908 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ce3a08b-0d84-46f7-aee4-c633105b323b-proxy-tls\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801927 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mw4tj\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") " pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801953 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5eb31828-9e51-4c7d-bc14-8787b4eca812-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-swm74\" (UID: \"5eb31828-9e51-4c7d-bc14-8787b4eca812\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.801978 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-csi-data-dir\") pod 
\"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.802019 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq5bx\" (UniqueName: \"kubernetes.io/projected/5b056811-6e63-410a-b961-29b5fe78025d-kube-api-access-cq5bx\") pod \"machine-config-controller-84d6567774-hvgrc\" (UID: \"5b056811-6e63-410a-b961-29b5fe78025d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.802044 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mw4tj\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") " pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.802064 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkkhk\" (UniqueName: \"kubernetes.io/projected/3fb76932-a1a4-4af2-bb2e-3f9b0e872e51-kube-api-access-fkkhk\") pod \"service-ca-9c57cc56f-2qpzw\" (UID: \"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51\") " pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.802087 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71586d21-5d3e-4ea9-840e-989af77915e8-config\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.802111 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b056811-6e63-410a-b961-29b5fe78025d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hvgrc\" (UID: \"5b056811-6e63-410a-b961-29b5fe78025d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.802148 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ce3a08b-0d84-46f7-aee4-c633105b323b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.802169 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/71586d21-5d3e-4ea9-840e-989af77915e8-etcd-client\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.802197 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.803856 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvxk\" (UniqueName: \"kubernetes.io/projected/8c99344f-ed48-4193-9c8f-46c8f295ee0c-kube-api-access-2tvxk\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.809342 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-bound-sa-token\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.809503 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eb31828-9e51-4c7d-bc14-8787b4eca812-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-swm74\" (UID: \"5eb31828-9e51-4c7d-bc14-8787b4eca812\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.809553 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdnw\" (UniqueName: \"kubernetes.io/projected/1ce3a08b-0d84-46f7-aee4-c633105b323b-kube-api-access-9zdnw\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.809635 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.809711 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slc5q\" (UniqueName: 
\"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-kube-api-access-slc5q\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.809845 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c631a2c-40ce-4b64-a211-4305d3ea15bb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8txgf\" (UID: \"7c631a2c-40ce-4b64-a211-4305d3ea15bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.809901 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-certificates\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.809920 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/347baa98-b887-46c5-b2e9-bb6809206b42-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vfvfc\" (UID: \"347baa98-b887-46c5-b2e9-bb6809206b42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.810039 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c99344f-ed48-4193-9c8f-46c8f295ee0c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc 
kubenswrapper[4713]: I0314 05:30:40.810088 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnhh7\" (UniqueName: \"kubernetes.io/projected/96a65c75-d6b8-41db-8e74-263b186c7596-kube-api-access-wnhh7\") pod \"dns-default-p46s4\" (UID: \"96a65c75-d6b8-41db-8e74-263b186c7596\") " pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.810149 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b00783a2-42c7-45b5-b83d-136c314b0086-webhook-cert\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.810198 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gd52\" (UniqueName: \"kubernetes.io/projected/d2608a13-9ee1-45ed-926b-329192ef4d34-kube-api-access-4gd52\") pod \"collect-profiles-29557770-vln29\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.810672 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-trusted-ca\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.810879 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ce3a08b-0d84-46f7-aee4-c633105b323b-proxy-tls\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.811237 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b00783a2-42c7-45b5-b83d-136c314b0086-tmpfs\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.811276 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6db3a7-58c6-44ad-8bed-daf1086729ad-serving-cert\") pod \"service-ca-operator-777779d784-9lrmn\" (UID: \"9d6db3a7-58c6-44ad-8bed-daf1086729ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.811324 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9gcv\" (UniqueName: \"kubernetes.io/projected/068aebba-22ff-46cd-856c-e85d409e0ae5-kube-api-access-z9gcv\") pod \"marketplace-operator-79b997595-mw4tj\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") " pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.811370 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/71586d21-5d3e-4ea9-840e-989af77915e8-etcd-ca\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.811431 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ce3a08b-0d84-46f7-aee4-c633105b323b-images\") 
pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.811466 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c631a2c-40ce-4b64-a211-4305d3ea15bb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8txgf\" (UID: \"7c631a2c-40ce-4b64-a211-4305d3ea15bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.811500 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/71586d21-5d3e-4ea9-840e-989af77915e8-etcd-service-ca\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.811525 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2608a13-9ee1-45ed-926b-329192ef4d34-config-volume\") pod \"collect-profiles-29557770-vln29\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.811558 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dbd3bea-9644-4bc5-96c7-822b26810706-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-w65zj\" (UID: \"0dbd3bea-9644-4bc5-96c7-822b26810706\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" Mar 14 
05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.812508 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71586d21-5d3e-4ea9-840e-989af77915e8-config\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.812995 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c99344f-ed48-4193-9c8f-46c8f295ee0c-trusted-ca\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.813118 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/347baa98-b887-46c5-b2e9-bb6809206b42-config\") pod \"kube-apiserver-operator-766d6c64bb-vfvfc\" (UID: \"347baa98-b887-46c5-b2e9-bb6809206b42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.814065 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/71586d21-5d3e-4ea9-840e-989af77915e8-etcd-ca\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.814865 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ce3a08b-0d84-46f7-aee4-c633105b323b-images\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 
crc kubenswrapper[4713]: I0314 05:30:40.815462 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c631a2c-40ce-4b64-a211-4305d3ea15bb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8txgf\" (UID: \"7c631a2c-40ce-4b64-a211-4305d3ea15bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.815931 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/71586d21-5d3e-4ea9-840e-989af77915e8-etcd-service-ca\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.816027 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d626c9fa-84ff-40c0-ae90-c477a699591a-default-certificate\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.818719 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ce3a08b-0d84-46f7-aee4-c633105b323b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.819348 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: 
\"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: E0314 05:30:40.819406 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.319374686 +0000 UTC m=+224.407283986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.826394 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3fb76932-a1a4-4af2-bb2e-3f9b0e872e51-signing-cabundle\") pod \"service-ca-9c57cc56f-2qpzw\" (UID: \"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51\") " pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.827011 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b056811-6e63-410a-b961-29b5fe78025d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hvgrc\" (UID: \"5b056811-6e63-410a-b961-29b5fe78025d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.827200 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/ed4a1500-6481-4d26-a107-f76299623688-profile-collector-cert\") pod \"catalog-operator-68c6474976-qkdqn\" (UID: \"ed4a1500-6481-4d26-a107-f76299623688\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.827279 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-plugins-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.827326 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/63d1129b-c978-4cb1-a73f-e09786113590-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v9q67\" (UID: \"63d1129b-c978-4cb1-a73f-e09786113590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.829494 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d626c9fa-84ff-40c0-ae90-c477a699591a-service-ca-bundle\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.830026 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/347baa98-b887-46c5-b2e9-bb6809206b42-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vfvfc\" (UID: \"347baa98-b887-46c5-b2e9-bb6809206b42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" Mar 14 
05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.830315 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-mountpoint-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.830362 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6sk\" (UniqueName: \"kubernetes.io/projected/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-kube-api-access-lx6sk\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.830824 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ebbfd4b5-734f-4e37-89af-c0f4f0904d94-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-l4d7n\" (UID: \"ebbfd4b5-734f-4e37-89af-c0f4f0904d94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.830943 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b056811-6e63-410a-b961-29b5fe78025d-proxy-tls\") pod \"machine-config-controller-84d6567774-hvgrc\" (UID: \"5b056811-6e63-410a-b961-29b5fe78025d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.830996 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntzs5\" (UniqueName: \"kubernetes.io/projected/a1866f7b-9063-44ab-86c3-5f0f38d9a2e5-kube-api-access-ntzs5\") pod \"ingress-canary-54p9f\" (UID: 
\"a1866f7b-9063-44ab-86c3-5f0f38d9a2e5\") " pod="openshift-ingress-canary/ingress-canary-54p9f" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.831224 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71586d21-5d3e-4ea9-840e-989af77915e8-serving-cert\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.831335 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6db3a7-58c6-44ad-8bed-daf1086729ad-config\") pod \"service-ca-operator-777779d784-9lrmn\" (UID: \"9d6db3a7-58c6-44ad-8bed-daf1086729ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.831630 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mw4tj\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") " pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.831640 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f31c44db-4635-4d5d-8aa5-98be5a6fd0ec-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rlhc9\" (UID: \"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.834731 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-socket-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.834899 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f31c44db-4635-4d5d-8aa5-98be5a6fd0ec-srv-cert\") pod \"olm-operator-6b444d44fb-rlhc9\" (UID: \"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.834947 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-tls\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.835140 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.835303 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qf54\" (UniqueName: \"kubernetes.io/projected/b0c5d3d7-8d33-4eba-a572-3c702a05a6df-kube-api-access-7qf54\") pod \"migrator-59844c95c7-qddwj\" (UID: \"b0c5d3d7-8d33-4eba-a572-3c702a05a6df\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.835355 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5e96825d-646e-44d7-b980-c7efcd08d2f3-node-bootstrap-token\") pod \"machine-config-server-5f26d\" (UID: \"5e96825d-646e-44d7-b980-c7efcd08d2f3\") " pod="openshift-machine-config-operator/machine-config-server-5f26d" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.835394 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eb31828-9e51-4c7d-bc14-8787b4eca812-config\") pod \"kube-controller-manager-operator-78b949d7b-swm74\" (UID: \"5eb31828-9e51-4c7d-bc14-8787b4eca812\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.835429 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjvmh\" (UniqueName: \"kubernetes.io/projected/f31c44db-4635-4d5d-8aa5-98be5a6fd0ec-kube-api-access-xjvmh\") pod \"olm-operator-6b444d44fb-rlhc9\" (UID: \"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.835458 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65j5\" (UniqueName: \"kubernetes.io/projected/d626c9fa-84ff-40c0-ae90-c477a699591a-kube-api-access-w65j5\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.835564 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5e96825d-646e-44d7-b980-c7efcd08d2f3-certs\") pod \"machine-config-server-5f26d\" (UID: \"5e96825d-646e-44d7-b980-c7efcd08d2f3\") " 
pod="openshift-machine-config-operator/machine-config-server-5f26d" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.836345 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eb31828-9e51-4c7d-bc14-8787b4eca812-config\") pod \"kube-controller-manager-operator-78b949d7b-swm74\" (UID: \"5eb31828-9e51-4c7d-bc14-8787b4eca812\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.836853 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/71586d21-5d3e-4ea9-840e-989af77915e8-etcd-client\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.839246 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/347baa98-b887-46c5-b2e9-bb6809206b42-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vfvfc\" (UID: \"347baa98-b887-46c5-b2e9-bb6809206b42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.840046 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d626c9fa-84ff-40c0-ae90-c477a699591a-stats-auth\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.840043 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eb31828-9e51-4c7d-bc14-8787b4eca812-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-swm74\" 
(UID: \"5eb31828-9e51-4c7d-bc14-8787b4eca812\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.840907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ed4a1500-6481-4d26-a107-f76299623688-profile-collector-cert\") pod \"catalog-operator-68c6474976-qkdqn\" (UID: \"ed4a1500-6481-4d26-a107-f76299623688\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.845672 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mw4tj\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") " pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.845770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d626c9fa-84ff-40c0-ae90-c477a699591a-default-certificate\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.848000 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c631a2c-40ce-4b64-a211-4305d3ea15bb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8txgf\" (UID: \"7c631a2c-40ce-4b64-a211-4305d3ea15bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.849406 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ed4a1500-6481-4d26-a107-f76299623688-srv-cert\") pod \"catalog-operator-68c6474976-qkdqn\" (UID: \"ed4a1500-6481-4d26-a107-f76299623688\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.850030 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d626c9fa-84ff-40c0-ae90-c477a699591a-metrics-certs\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.850276 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71586d21-5d3e-4ea9-840e-989af77915e8-serving-cert\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.850342 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f29wv\" (UniqueName: \"kubernetes.io/projected/63d1129b-c978-4cb1-a73f-e09786113590-kube-api-access-f29wv\") pod \"control-plane-machine-set-operator-78cbb6b69f-v9q67\" (UID: \"63d1129b-c978-4cb1-a73f-e09786113590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.850733 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 
05:30:40.850898 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b056811-6e63-410a-b961-29b5fe78025d-proxy-tls\") pod \"machine-config-controller-84d6567774-hvgrc\" (UID: \"5b056811-6e63-410a-b961-29b5fe78025d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.851083 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/63d1129b-c978-4cb1-a73f-e09786113590-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v9q67\" (UID: \"63d1129b-c978-4cb1-a73f-e09786113590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.852794 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-tls\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.852960 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8c99344f-ed48-4193-9c8f-46c8f295ee0c-metrics-tls\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.861403 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq5bx\" (UniqueName: \"kubernetes.io/projected/5b056811-6e63-410a-b961-29b5fe78025d-kube-api-access-cq5bx\") pod \"machine-config-controller-84d6567774-hvgrc\" (UID: 
\"5b056811-6e63-410a-b961-29b5fe78025d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: W0314 05:30:40.879252 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ca75b92_342d_46ef_8307_92efc0200a55.slice/crio-69caafabd5ef72460d374a0f63ed84cd4b0313b00dc274f7a2181767fb4aa46a WatchSource:0}: Error finding container 69caafabd5ef72460d374a0f63ed84cd4b0313b00dc274f7a2181767fb4aa46a: Status 404 returned error can't find the container with id 69caafabd5ef72460d374a0f63ed84cd4b0313b00dc274f7a2181767fb4aa46a Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.890742 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r85f9\" (UniqueName: \"kubernetes.io/projected/7c631a2c-40ce-4b64-a211-4305d3ea15bb-kube-api-access-r85f9\") pod \"kube-storage-version-migrator-operator-b67b599dd-8txgf\" (UID: \"7c631a2c-40ce-4b64-a211-4305d3ea15bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.893736 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.914718 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.920235 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c99344f-ed48-4193-9c8f-46c8f295ee0c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.921059 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.928074 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-748rb"] Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.932223 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5eb31828-9e51-4c7d-bc14-8787b4eca812-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-swm74\" (UID: \"5eb31828-9e51-4c7d-bc14-8787b4eca812\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.936701 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.937047 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-socket-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.937090 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f31c44db-4635-4d5d-8aa5-98be5a6fd0ec-srv-cert\") pod \"olm-operator-6b444d44fb-rlhc9\" (UID: \"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939398 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5e96825d-646e-44d7-b980-c7efcd08d2f3-node-bootstrap-token\") pod \"machine-config-server-5f26d\" (UID: \"5e96825d-646e-44d7-b980-c7efcd08d2f3\") " pod="openshift-machine-config-operator/machine-config-server-5f26d" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjvmh\" (UniqueName: \"kubernetes.io/projected/f31c44db-4635-4d5d-8aa5-98be5a6fd0ec-kube-api-access-xjvmh\") pod \"olm-operator-6b444d44fb-rlhc9\" (UID: \"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939561 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5e96825d-646e-44d7-b980-c7efcd08d2f3-certs\") pod \"machine-config-server-5f26d\" (UID: \"5e96825d-646e-44d7-b980-c7efcd08d2f3\") " pod="openshift-machine-config-operator/machine-config-server-5f26d" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939608 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-registration-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939634 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzcnf\" (UniqueName: \"kubernetes.io/projected/5e96825d-646e-44d7-b980-c7efcd08d2f3-kube-api-access-mzcnf\") pod \"machine-config-server-5f26d\" (UID: \"5e96825d-646e-44d7-b980-c7efcd08d2f3\") " pod="openshift-machine-config-operator/machine-config-server-5f26d" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939688 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xms2d\" (UniqueName: \"kubernetes.io/projected/0dbd3bea-9644-4bc5-96c7-822b26810706-kube-api-access-xms2d\") pod \"package-server-manager-789f6589d5-w65zj\" (UID: \"0dbd3bea-9644-4bc5-96c7-822b26810706\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939729 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96a65c75-d6b8-41db-8e74-263b186c7596-config-volume\") pod \"dns-default-p46s4\" (UID: \"96a65c75-d6b8-41db-8e74-263b186c7596\") " pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939787 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swwcn\" (UniqueName: \"kubernetes.io/projected/62a03b8e-3b89-41c5-9399-a6ae0d44a53c-kube-api-access-swwcn\") pod \"auto-csr-approver-29557770-dmq2m\" (UID: \"62a03b8e-3b89-41c5-9399-a6ae0d44a53c\") " pod="openshift-infra/auto-csr-approver-29557770-dmq2m" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939822 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1866f7b-9063-44ab-86c3-5f0f38d9a2e5-cert\") pod \"ingress-canary-54p9f\" (UID: \"a1866f7b-9063-44ab-86c3-5f0f38d9a2e5\") " pod="openshift-ingress-canary/ingress-canary-54p9f" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.939870 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6zlm\" (UniqueName: \"kubernetes.io/projected/9d6db3a7-58c6-44ad-8bed-daf1086729ad-kube-api-access-m6zlm\") pod \"service-ca-operator-777779d784-9lrmn\" (UID: \"9d6db3a7-58c6-44ad-8bed-daf1086729ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.940052 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2608a13-9ee1-45ed-926b-329192ef4d34-secret-volume\") pod \"collect-profiles-29557770-vln29\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.940075 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3fb76932-a1a4-4af2-bb2e-3f9b0e872e51-signing-key\") pod \"service-ca-9c57cc56f-2qpzw\" (UID: \"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51\") " pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.941159 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b00783a2-42c7-45b5-b83d-136c314b0086-apiservice-cert\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: 
I0314 05:30:40.941248 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96a65c75-d6b8-41db-8e74-263b186c7596-metrics-tls\") pod \"dns-default-p46s4\" (UID: \"96a65c75-d6b8-41db-8e74-263b186c7596\") " pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:40 crc kubenswrapper[4713]: E0314 05:30:40.943357 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.443327307 +0000 UTC m=+224.531236607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.943423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw8tq\" (UniqueName: \"kubernetes.io/projected/b00783a2-42c7-45b5-b83d-136c314b0086-kube-api-access-qw8tq\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.943484 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-csi-data-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 
05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.944368 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkkhk\" (UniqueName: \"kubernetes.io/projected/3fb76932-a1a4-4af2-bb2e-3f9b0e872e51-kube-api-access-fkkhk\") pod \"service-ca-9c57cc56f-2qpzw\" (UID: \"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51\") " pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.944550 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.945030 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnhh7\" (UniqueName: \"kubernetes.io/projected/96a65c75-d6b8-41db-8e74-263b186c7596-kube-api-access-wnhh7\") pod \"dns-default-p46s4\" (UID: \"96a65c75-d6b8-41db-8e74-263b186c7596\") " pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.945057 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b00783a2-42c7-45b5-b83d-136c314b0086-webhook-cert\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.945734 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gd52\" (UniqueName: \"kubernetes.io/projected/d2608a13-9ee1-45ed-926b-329192ef4d34-kube-api-access-4gd52\") pod \"collect-profiles-29557770-vln29\" (UID: 
\"d2608a13-9ee1-45ed-926b-329192ef4d34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.945758 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b00783a2-42c7-45b5-b83d-136c314b0086-tmpfs\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.945810 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6db3a7-58c6-44ad-8bed-daf1086729ad-serving-cert\") pod \"service-ca-operator-777779d784-9lrmn\" (UID: \"9d6db3a7-58c6-44ad-8bed-daf1086729ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.945885 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2608a13-9ee1-45ed-926b-329192ef4d34-config-volume\") pod \"collect-profiles-29557770-vln29\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.945914 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dbd3bea-9644-4bc5-96c7-822b26810706-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-w65zj\" (UID: \"0dbd3bea-9644-4bc5-96c7-822b26810706\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.945974 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/3fb76932-a1a4-4af2-bb2e-3f9b0e872e51-signing-cabundle\") pod \"service-ca-9c57cc56f-2qpzw\" (UID: \"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51\") " pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.946011 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-plugins-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.946098 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-mountpoint-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.946268 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6sk\" (UniqueName: \"kubernetes.io/projected/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-kube-api-access-lx6sk\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.946299 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntzs5\" (UniqueName: \"kubernetes.io/projected/a1866f7b-9063-44ab-86c3-5f0f38d9a2e5-kube-api-access-ntzs5\") pod \"ingress-canary-54p9f\" (UID: \"a1866f7b-9063-44ab-86c3-5f0f38d9a2e5\") " pod="openshift-ingress-canary/ingress-canary-54p9f" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.946328 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d6db3a7-58c6-44ad-8bed-daf1086729ad-config\") pod \"service-ca-operator-777779d784-9lrmn\" (UID: \"9d6db3a7-58c6-44ad-8bed-daf1086729ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.946356 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f31c44db-4635-4d5d-8aa5-98be5a6fd0ec-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rlhc9\" (UID: \"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.947104 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6"] Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.949052 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1866f7b-9063-44ab-86c3-5f0f38d9a2e5-cert\") pod \"ingress-canary-54p9f\" (UID: \"a1866f7b-9063-44ab-86c3-5f0f38d9a2e5\") " pod="openshift-ingress-canary/ingress-canary-54p9f" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.949790 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-socket-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.950248 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5e96825d-646e-44d7-b980-c7efcd08d2f3-node-bootstrap-token\") pod \"machine-config-server-5f26d\" (UID: \"5e96825d-646e-44d7-b980-c7efcd08d2f3\") " pod="openshift-machine-config-operator/machine-config-server-5f26d" Mar 
14 05:30:40 crc kubenswrapper[4713]: E0314 05:30:40.951790 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.451766132 +0000 UTC m=+224.539675432 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.951859 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-csi-data-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.952438 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-plugins-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.952468 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-mountpoint-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: 
I0314 05:30:40.953362 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6db3a7-58c6-44ad-8bed-daf1086729ad-config\") pod \"service-ca-operator-777779d784-9lrmn\" (UID: \"9d6db3a7-58c6-44ad-8bed-daf1086729ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.953691 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3fb76932-a1a4-4af2-bb2e-3f9b0e872e51-signing-key\") pod \"service-ca-9c57cc56f-2qpzw\" (UID: \"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51\") " pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.954251 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b00783a2-42c7-45b5-b83d-136c314b0086-tmpfs\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.954335 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9gcv\" (UniqueName: \"kubernetes.io/projected/068aebba-22ff-46cd-856c-e85d409e0ae5-kube-api-access-z9gcv\") pod \"marketplace-operator-79b997595-mw4tj\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") " pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.955290 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2608a13-9ee1-45ed-926b-329192ef4d34-config-volume\") pod \"collect-profiles-29557770-vln29\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:40 crc 
kubenswrapper[4713]: I0314 05:30:40.955542 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-registration-dir\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.957282 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3fb76932-a1a4-4af2-bb2e-3f9b0e872e51-signing-cabundle\") pod \"service-ca-9c57cc56f-2qpzw\" (UID: \"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51\") " pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.959707 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f31c44db-4635-4d5d-8aa5-98be5a6fd0ec-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rlhc9\" (UID: \"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.959942 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b00783a2-42c7-45b5-b83d-136c314b0086-apiservice-cert\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.960239 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96a65c75-d6b8-41db-8e74-263b186c7596-config-volume\") pod \"dns-default-p46s4\" (UID: \"96a65c75-d6b8-41db-8e74-263b186c7596\") " pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 
05:30:40.963027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f31c44db-4635-4d5d-8aa5-98be5a6fd0ec-srv-cert\") pod \"olm-operator-6b444d44fb-rlhc9\" (UID: \"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.964186 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2608a13-9ee1-45ed-926b-329192ef4d34-secret-volume\") pod \"collect-profiles-29557770-vln29\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.966837 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cdq27"] Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.968296 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5e96825d-646e-44d7-b980-c7efcd08d2f3-certs\") pod \"machine-config-server-5f26d\" (UID: \"5e96825d-646e-44d7-b980-c7efcd08d2f3\") " pod="openshift-machine-config-operator/machine-config-server-5f26d" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.969110 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dbd3bea-9644-4bc5-96c7-822b26810706-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-w65zj\" (UID: \"0dbd3bea-9644-4bc5-96c7-822b26810706\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.970666 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96a65c75-d6b8-41db-8e74-263b186c7596-metrics-tls\") pod \"dns-default-p46s4\" (UID: \"96a65c75-d6b8-41db-8e74-263b186c7596\") " pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.974154 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b00783a2-42c7-45b5-b83d-136c314b0086-webhook-cert\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.977087 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d6db3a7-58c6-44ad-8bed-daf1086729ad-serving-cert\") pod \"service-ca-operator-777779d784-9lrmn\" (UID: \"9d6db3a7-58c6-44ad-8bed-daf1086729ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" Mar 14 05:30:40 crc kubenswrapper[4713]: I0314 05:30:40.980267 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5hzl\" (UniqueName: \"kubernetes.io/projected/ed4a1500-6481-4d26-a107-f76299623688-kube-api-access-d5hzl\") pod \"catalog-operator-68c6474976-qkdqn\" (UID: \"ed4a1500-6481-4d26-a107-f76299623688\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:40 crc kubenswrapper[4713]: W0314 05:30:40.983048 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb6723a_f90c_46d8_a294_d9f916179353.slice/crio-b7f925692a54922d9dd054b28ecd7507f6216dae42d406f10bd83ee4dc03cd59 WatchSource:0}: Error finding container b7f925692a54922d9dd054b28ecd7507f6216dae42d406f10bd83ee4dc03cd59: Status 404 returned error can't find the container with id b7f925692a54922d9dd054b28ecd7507f6216dae42d406f10bd83ee4dc03cd59 Mar 14 05:30:40 crc 
kubenswrapper[4713]: I0314 05:30:40.998076 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7gnr\" (UniqueName: \"kubernetes.io/projected/ebbfd4b5-734f-4e37-89af-c0f4f0904d94-kube-api-access-h7gnr\") pod \"multus-admission-controller-857f4d67dd-l4d7n\" (UID: \"ebbfd4b5-734f-4e37-89af-c0f4f0904d94\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.013746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvxk\" (UniqueName: \"kubernetes.io/projected/8c99344f-ed48-4193-9c8f-46c8f295ee0c-kube-api-access-2tvxk\") pod \"ingress-operator-5b745b69d9-592mg\" (UID: \"8c99344f-ed48-4193-9c8f-46c8f295ee0c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.038576 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb"] Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.043889 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp"] Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.045023 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkr25\" (UniqueName: \"kubernetes.io/projected/71586d21-5d3e-4ea9-840e-989af77915e8-kube-api-access-jkr25\") pod \"etcd-operator-b45778765-5vl8b\" (UID: \"71586d21-5d3e-4ea9-840e-989af77915e8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.048424 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.048939 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.548919622 +0000 UTC m=+224.636828912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.054940 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.066266 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-bound-sa-token\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.073486 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rp4kf"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.077311 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdnw\" (UniqueName: \"kubernetes.io/projected/1ce3a08b-0d84-46f7-aee4-c633105b323b-kube-api-access-9zdnw\") pod \"machine-config-operator-74547568cd-bvnfp\" (UID: \"1ce3a08b-0d84-46f7-aee4-c633105b323b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.088027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slc5q\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-kube-api-access-slc5q\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:41 crc kubenswrapper[4713]: W0314 05:30:41.102800 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1566f9e4_cb9e_4ed5_b2c3_ec9e0ba1a294.slice/crio-00d8da6df0e18ca3f1d34dd67cfd5474783497fbb5f941190c6f2938e487615d WatchSource:0}: Error finding container 00d8da6df0e18ca3f1d34dd67cfd5474783497fbb5f941190c6f2938e487615d: Status 404 returned error can't find the container with id 00d8da6df0e18ca3f1d34dd67cfd5474783497fbb5f941190c6f2938e487615d
Mar 14 05:30:41 crc kubenswrapper[4713]: W0314 05:30:41.108686 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96bf650f_2c46_40aa_b26b_5d8a6df529fd.slice/crio-1fd1cac48e0bebe17970877303621cd970416d7ca396606fed13d9bc2e39ad41 WatchSource:0}: Error finding container 1fd1cac48e0bebe17970877303621cd970416d7ca396606fed13d9bc2e39ad41: Status 404 returned error can't find the container with id 1fd1cac48e0bebe17970877303621cd970416d7ca396606fed13d9bc2e39ad41
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.117175 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/347baa98-b887-46c5-b2e9-bb6809206b42-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vfvfc\" (UID: \"347baa98-b887-46c5-b2e9-bb6809206b42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.130618 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65j5\" (UniqueName: \"kubernetes.io/projected/d626c9fa-84ff-40c0-ae90-c477a699591a-kube-api-access-w65j5\") pod \"router-default-5444994796-s276w\" (UID: \"d626c9fa-84ff-40c0-ae90-c477a699591a\") " pod="openshift-ingress/router-default-5444994796-s276w"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.150058 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.150185 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qf54\" (UniqueName: \"kubernetes.io/projected/b0c5d3d7-8d33-4eba-a572-3c702a05a6df-kube-api-access-7qf54\") pod \"migrator-59844c95c7-qddwj\" (UID: \"b0c5d3d7-8d33-4eba-a572-3c702a05a6df\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj"
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.150553 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.650539502 +0000 UTC m=+224.738448802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.161804 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.177276 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.185038 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.200613 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzcnf\" (UniqueName: \"kubernetes.io/projected/5e96825d-646e-44d7-b980-c7efcd08d2f3-kube-api-access-mzcnf\") pod \"machine-config-server-5f26d\" (UID: \"5e96825d-646e-44d7-b980-c7efcd08d2f3\") " pod="openshift-machine-config-operator/machine-config-server-5f26d"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.206828 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.222340 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjvmh\" (UniqueName: \"kubernetes.io/projected/f31c44db-4635-4d5d-8aa5-98be5a6fd0ec-kube-api-access-xjvmh\") pod \"olm-operator-6b444d44fb-rlhc9\" (UID: \"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.230008 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.232528 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ztqtl"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.233719 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkkhk\" (UniqueName: \"kubernetes.io/projected/3fb76932-a1a4-4af2-bb2e-3f9b0e872e51-kube-api-access-fkkhk\") pod \"service-ca-9c57cc56f-2qpzw\" (UID: \"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51\") " pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.234710 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.235530 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.245681 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.250625 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vmsm7"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.251005 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.252139 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.253000 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.752984428 +0000 UTC m=+224.840893728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.254823 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swwcn\" (UniqueName: \"kubernetes.io/projected/62a03b8e-3b89-41c5-9399-a6ae0d44a53c-kube-api-access-swwcn\") pod \"auto-csr-approver-29557770-dmq2m\" (UID: \"62a03b8e-3b89-41c5-9399-a6ae0d44a53c\") " pod="openshift-infra/auto-csr-approver-29557770-dmq2m"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.257856 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-s276w"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.258830 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.265079 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.271643 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-whskd"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.271856 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.276844 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6zlm\" (UniqueName: \"kubernetes.io/projected/9d6db3a7-58c6-44ad-8bed-daf1086729ad-kube-api-access-m6zlm\") pod \"service-ca-operator-777779d784-9lrmn\" (UID: \"9d6db3a7-58c6-44ad-8bed-daf1086729ad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.284018 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.294813 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw8tq\" (UniqueName: \"kubernetes.io/projected/b00783a2-42c7-45b5-b83d-136c314b0086-kube-api-access-qw8tq\") pod \"packageserver-d55dfcdfc-wp5sf\" (UID: \"b00783a2-42c7-45b5-b83d-136c314b0086\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.295121 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.308827 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.314501 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ph5z9"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.315131 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84xqp"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.317906 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" event={"ID":"1af06ce0-0d0f-4113-b4e3-32d82664688b","Type":"ContainerStarted","Data":"b014d7e1d6a68f35b480494cbb2e2a40da22175f36c34d9c756a163a2e382f5b"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.317954 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" event={"ID":"1af06ce0-0d0f-4113-b4e3-32d82664688b","Type":"ContainerStarted","Data":"e41217095474aa0dba30d1ec74bd79d96079e5129df615c597aa382a04477568"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.318995 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnhh7\" (UniqueName: \"kubernetes.io/projected/96a65c75-d6b8-41db-8e74-263b186c7596-kube-api-access-wnhh7\") pod \"dns-default-p46s4\" (UID: \"96a65c75-d6b8-41db-8e74-263b186c7596\") " pod="openshift-dns/dns-default-p46s4"
Mar 14 05:30:41 crc kubenswrapper[4713]: W0314 05:30:41.320933 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80e7a49a_8aa3_41d2_b1bf_74a0689fbee6.slice/crio-d23c7e3e815ac0d996a537b7da32efebd1afcf494fbb283113c6e8d7eae4f177 WatchSource:0}: Error finding container d23c7e3e815ac0d996a537b7da32efebd1afcf494fbb283113c6e8d7eae4f177: Status 404 returned error can't find the container with id d23c7e3e815ac0d996a537b7da32efebd1afcf494fbb283113c6e8d7eae4f177
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.323606 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" event={"ID":"da4bb664-a24b-4644-9b7a-a0c6eda2c66f","Type":"ContainerStarted","Data":"7a34de7f9933d1a6131f5624aadd1c4e09a5fae8f63484956c1a759ad4e00094"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.328426 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cdq27" event={"ID":"8fb6723a-f90c-46d8-a294-d9f916179353","Type":"ContainerStarted","Data":"b7f925692a54922d9dd054b28ecd7507f6216dae42d406f10bd83ee4dc03cd59"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.329947 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" event={"ID":"7ca75b92-342d-46ef-8307-92efc0200a55","Type":"ContainerStarted","Data":"69caafabd5ef72460d374a0f63ed84cd4b0313b00dc274f7a2181767fb4aa46a"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.334459 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf"]
Mar 14 05:30:41 crc kubenswrapper[4713]: W0314 05:30:41.338117 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9fcc69_c663_4474_b449_eee4c468cd4f.slice/crio-d81693accaf24ae0de11901ecbd9559d53f78e3acaab26cbbc9345f69245c300 WatchSource:0}: Error finding container d81693accaf24ae0de11901ecbd9559d53f78e3acaab26cbbc9345f69245c300: Status 404 returned error can't find the container with id d81693accaf24ae0de11901ecbd9559d53f78e3acaab26cbbc9345f69245c300
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.338940 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb" event={"ID":"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294","Type":"ContainerStarted","Data":"00d8da6df0e18ca3f1d34dd67cfd5474783497fbb5f941190c6f2938e487615d"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.339040 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557770-dmq2m"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.346236 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" event={"ID":"352eafb9-871b-4540-9083-d5a6c0340453","Type":"ContainerStarted","Data":"6fb636c45382273dc52e75f9b6dc6d12f236760c3509bdb0c8611a2e1f868f15"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.346278 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" event={"ID":"352eafb9-871b-4540-9083-d5a6c0340453","Type":"ContainerStarted","Data":"98588770d435e057ed4ca5ab2eb5a7d4c2c221f6f68de5db06eb417fc1579ca6"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.349313 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6sk\" (UniqueName: \"kubernetes.io/projected/3f6d9ce2-7015-482b-8249-c1e1dfb09be3-kube-api-access-lx6sk\") pod \"csi-hostpathplugin-nwnrf\" (UID: \"3f6d9ce2-7015-482b-8249-c1e1dfb09be3\") " pod="hostpath-provisioner/csi-hostpathplugin-nwnrf"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.349332 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.350454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntzs5\" (UniqueName: \"kubernetes.io/projected/a1866f7b-9063-44ab-86c3-5f0f38d9a2e5-kube-api-access-ntzs5\") pod \"ingress-canary-54p9f\" (UID: \"a1866f7b-9063-44ab-86c3-5f0f38d9a2e5\") " pod="openshift-ingress-canary/ingress-canary-54p9f"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.351427 4713 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8p2ln container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.351804 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" event={"ID":"68986d18-3623-49a9-88c3-983c5acf2e09","Type":"ContainerStarted","Data":"14021fa7e3b442eaa483045d84604a7e7d3cf398eee229faab784aabb859e473"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.351875 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" podUID="352eafb9-871b-4540-9083-d5a6c0340453" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.354246 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.354698 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.85468281 +0000 UTC m=+224.942592110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.361577 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rp4kf" event={"ID":"96bf650f-2c46-40aa-b26b-5d8a6df529fd","Type":"ContainerStarted","Data":"1fd1cac48e0bebe17970877303621cd970416d7ca396606fed13d9bc2e39ad41"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.364334 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.367544 4713 generic.go:334] "Generic (PLEG): container finished" podID="d6d1d72a-09c8-47bc-ad1a-ad11843c9a30" containerID="0444d8f05f9badfe9c110d840e70d4c2a23c310fe7c90ac1945525834ddd9061" exitCode=0
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.367643 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-24csf" event={"ID":"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30","Type":"ContainerDied","Data":"0444d8f05f9badfe9c110d840e70d4c2a23c310fe7c90ac1945525834ddd9061"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.367683 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-24csf" event={"ID":"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30","Type":"ContainerStarted","Data":"6f5749024c5a21fe279d3e5109a9a0c513a9cfb978d69443c5794f21fc6ce6b4"}
Mar 14 05:30:41 crc kubenswrapper[4713]: W0314 05:30:41.371377 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c75654_c34a_4a93_8be1_ab766574a3ed.slice/crio-5b6bfc2871456031db13dc7219c88e12c6806c4692496fd62bbc220c15b82d96 WatchSource:0}: Error finding container 5b6bfc2871456031db13dc7219c88e12c6806c4692496fd62bbc220c15b82d96: Status 404 returned error can't find the container with id 5b6bfc2871456031db13dc7219c88e12c6806c4692496fd62bbc220c15b82d96
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.376357 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p46s4"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.376700 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" event={"ID":"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b","Type":"ContainerStarted","Data":"32e6646bfa72e19f6010fa61e8d4ab54bdd5b9b0828edbba1d71e08450205a88"}
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.384673 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5f26d"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.389952 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-54p9f"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.396196 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xms2d\" (UniqueName: \"kubernetes.io/projected/0dbd3bea-9644-4bc5-96c7-822b26810706-kube-api-access-xms2d\") pod \"package-server-manager-789f6589d5-w65zj\" (UID: \"0dbd3bea-9644-4bc5-96c7-822b26810706\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.397630 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gd52\" (UniqueName: \"kubernetes.io/projected/d2608a13-9ee1-45ed-926b-329192ef4d34-kube-api-access-4gd52\") pod \"collect-profiles-29557770-vln29\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.408988 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.456001 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.456792 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:41.956776525 +0000 UTC m=+225.044685825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.558300 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.558700 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.058681074 +0000 UTC m=+225.146590374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.586130 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.601185 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-l4d7n"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.631165 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nwnrf"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.648473 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.660128 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.661159 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.661425 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.161402619 +0000 UTC m=+225.249311919 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.661460 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.661948 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.161937285 +0000 UTC m=+225.249846585 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.669926 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj"
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.709594 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2qpzw"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.762817 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.763308 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.263275256 +0000 UTC m=+225.351184556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.864803 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.865536 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.365514016 +0000 UTC m=+225.453423316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: W0314 05:30:41.866736 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd626c9fa_84ff_40c0_ae90_c477a699591a.slice/crio-5544bfff23da9bc42d06183ae5620fd86f3a7af53663febe8ff29cd90db7b919 WatchSource:0}: Error finding container 5544bfff23da9bc42d06183ae5620fd86f3a7af53663febe8ff29cd90db7b919: Status 404 returned error can't find the container with id 5544bfff23da9bc42d06183ae5620fd86f3a7af53663febe8ff29cd90db7b919
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.950926 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn"]
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.966007 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.966316 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.466287379 +0000 UTC m=+225.554196719 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:41 crc kubenswrapper[4713]: I0314 05:30:41.966477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:41 crc kubenswrapper[4713]: E0314 05:30:41.966836 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.466828526 +0000 UTC m=+225.554737826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.073014 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.073570 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.573526285 +0000 UTC m=+225.661435585 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.082428 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.090417 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.590393934 +0000 UTC m=+225.678303244 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.145978 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.147072 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557770-dmq2m"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.154654 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" podStartSLOduration=155.154637051 podStartE2EDuration="2m35.154637051s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:42.152752232 +0000 UTC m=+225.240661532" watchObservedRunningTime="2026-03-14 05:30:42.154637051 +0000 UTC m=+225.242546351" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.184193 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.184363 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.684336394 +0000 UTC m=+225.772245684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.184800 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.185113 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.685106588 +0000 UTC m=+225.773015888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: W0314 05:30:42.197490 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod347baa98_b887_46c5_b2e9_bb6809206b42.slice/crio-3de6210dc9d5c8d63cd63b744ae7cc4bb17e69fe7cdf3aa382740a8666810a47 WatchSource:0}: Error finding container 3de6210dc9d5c8d63cd63b744ae7cc4bb17e69fe7cdf3aa382740a8666810a47: Status 404 returned error can't find the container with id 3de6210dc9d5c8d63cd63b744ae7cc4bb17e69fe7cdf3aa382740a8666810a47 Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.257979 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mw4tj"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.258739 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-592mg"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.284716 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.287612 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.287950 4713 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.787934446 +0000 UTC m=+225.875843746 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.330358 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5vl8b"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.339602 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.345598 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-54p9f"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.374729 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.378389 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.385448 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" 
event={"ID":"7c631a2c-40ce-4b64-a211-4305d3ea15bb","Type":"ContainerStarted","Data":"ac9ce6b5a6bb874b8e234aabb7691b818a1164ef2c4722347630f1f2f2e6a488"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.385487 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" event={"ID":"7c631a2c-40ce-4b64-a211-4305d3ea15bb","Type":"ContainerStarted","Data":"262e44628eb44d791981f5fd724064af4c3cded851d4cc8da6626adcb73d22d5"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.387043 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-whskd" event={"ID":"fc9fcc69-c663-4474-b449-eee4c468cd4f","Type":"ContainerStarted","Data":"af68f0191b2786bd3658580500428ccb817910fe006476043fbd0556ccc824cd"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.387067 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-whskd" event={"ID":"fc9fcc69-c663-4474-b449-eee4c468cd4f","Type":"ContainerStarted","Data":"d81693accaf24ae0de11901ecbd9559d53f78e3acaab26cbbc9345f69245c300"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.387653 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-whskd" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.390042 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.390386 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.890374741 +0000 UTC m=+225.978284031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.390839 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-whskd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.390917 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-whskd" podUID="fc9fcc69-c663-4474-b449-eee4c468cd4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.392375 4713 generic.go:334] "Generic (PLEG): container finished" podID="68986d18-3623-49a9-88c3-983c5acf2e09" containerID="0b0c1687dfc72c34bea24bfef60ca3009f849c84e9960f7174b8d7dd3c1d9b78" exitCode=0 Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.392431 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" event={"ID":"68986d18-3623-49a9-88c3-983c5acf2e09","Type":"ContainerDied","Data":"0b0c1687dfc72c34bea24bfef60ca3009f849c84e9960f7174b8d7dd3c1d9b78"} Mar 14 05:30:42 
crc kubenswrapper[4713]: I0314 05:30:42.404490 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" event={"ID":"ed4a1500-6481-4d26-a107-f76299623688","Type":"ContainerStarted","Data":"f387a1199c75316bc2c17038e560835c726959d3051c4b91d5e0d09e0466e60c"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.423254 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" event={"ID":"7ca75b92-342d-46ef-8307-92efc0200a55","Type":"ContainerStarted","Data":"5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.423415 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.425477 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" event={"ID":"347baa98-b887-46c5-b2e9-bb6809206b42","Type":"ContainerStarted","Data":"3de6210dc9d5c8d63cd63b744ae7cc4bb17e69fe7cdf3aa382740a8666810a47"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.430504 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" event={"ID":"ebbfd4b5-734f-4e37-89af-c0f4f0904d94","Type":"ContainerStarted","Data":"ce14193bac1627e404ac5340c786a780ca30aee13f079740b13b3e3ee683c1ee"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.434908 4713 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-v9rqn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.434949 4713 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" podUID="7ca75b92-342d-46ef-8307-92efc0200a55" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.471765 4713 generic.go:334] "Generic (PLEG): container finished" podID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerID="b48891c96ba46740fca00c5aa0e6c71fa2e67f108ec53d706b02023fc3ac0a95" exitCode=0 Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.473143 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" event={"ID":"da4bb664-a24b-4644-9b7a-a0c6eda2c66f","Type":"ContainerDied","Data":"b48891c96ba46740fca00c5aa0e6c71fa2e67f108ec53d706b02023fc3ac0a95"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.485113 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" event={"ID":"62a03b8e-3b89-41c5-9399-a6ae0d44a53c","Type":"ContainerStarted","Data":"6b8eac694448033f5042d86460f3120dafd3508761f8a39ebb9f19ca932e5e66"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.486434 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" event={"ID":"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec","Type":"ContainerStarted","Data":"afcbe2fe1ecbd4ad35fd3ab86d1d0a9d418d0d1da55b2c394f0ccbc452694e33"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.488280 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" event={"ID":"6491e577-c2d1-4c4b-b1d0-e82b34eec943","Type":"ContainerStarted","Data":"664a55ffcd38c8805437725d6089ec3d4bf553b5974497a43a2e6295e9c819e4"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.488302 4713 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" event={"ID":"6491e577-c2d1-4c4b-b1d0-e82b34eec943","Type":"ContainerStarted","Data":"97bfc55a19ae2cd7f1b70ad1f6f12f8ef88ee5f3baf82af967dc20f526264bf8"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.489346 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.490525 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.491515 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:42.991499336 +0000 UTC m=+226.079408636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.512433 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" event={"ID":"5b056811-6e63-410a-b961-29b5fe78025d","Type":"ContainerStarted","Data":"10e9203541e3de0ae548cca815f2fb11090aab7223c3f1f0fbf833a74031a5aa"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.518928 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" event={"ID":"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51","Type":"ContainerStarted","Data":"4cc99d24dd82c82a5016feb113133a47b222a09c174af7fab0387760111bdb9c"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.519056 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-84xqp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.519092 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" podUID="6491e577-c2d1-4c4b-b1d0-e82b34eec943" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.564939 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" event={"ID":"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6","Type":"ContainerStarted","Data":"58fd1fe6cbbe0c0b168329da3b7205d2d58d47c06bb85b141dbce23d63eefea5"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.564989 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" event={"ID":"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6","Type":"ContainerStarted","Data":"d23c7e3e815ac0d996a537b7da32efebd1afcf494fbb283113c6e8d7eae4f177"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.576282 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" event={"ID":"1401ff24-2230-4431-8b63-d4d0980daa0c","Type":"ContainerStarted","Data":"55651758ee01e1cfed04e958b7e6227dbebcc5671ce7e44a70eb74b98ce9206f"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.576335 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" event={"ID":"1401ff24-2230-4431-8b63-d4d0980daa0c","Type":"ContainerStarted","Data":"9b08f3a1009fdfc3c4b538133ee2cf48ee9b4a6b731d53b9df1bd0386ea59ce3"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.592675 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.595674 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 05:30:43.095651355 +0000 UTC m=+226.183560655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.595760 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" event={"ID":"8a9d1407-ed7b-4d70-bffa-5a2c5cb1b91b","Type":"ContainerStarted","Data":"17b4ad4f6c6e8c5cdbe5c7aeea69c9f8e321324316b5f6e21cf455a19937ec25"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.600339 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rp4kf" event={"ID":"96bf650f-2c46-40aa-b26b-5d8a6df529fd","Type":"ContainerStarted","Data":"5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.611441 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s276w" event={"ID":"d626c9fa-84ff-40c0-ae90-c477a699591a","Type":"ContainerStarted","Data":"5544bfff23da9bc42d06183ae5620fd86f3a7af53663febe8ff29cd90db7b919"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.623850 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" event={"ID":"63d1129b-c978-4cb1-a73f-e09786113590","Type":"ContainerStarted","Data":"fce5c8da56e70534571d7e80edd2d9f684b45eb9d40fc840f05421a001d16c82"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.637187 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp" event={"ID":"ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037","Type":"ContainerStarted","Data":"7ebc6beb154f7a193fef93b4f9c447209300fc1e22f7ba205336a42f9c92c122"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.637256 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp" event={"ID":"ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037","Type":"ContainerStarted","Data":"2e38694e305256d41106e3ebb300c1dfae9c3a83e4a1a968867f1cb17911dfa6"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.638993 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p" event={"ID":"04c75654-c34a-4a93-8be1-ab766574a3ed","Type":"ContainerStarted","Data":"5b6bfc2871456031db13dc7219c88e12c6806c4692496fd62bbc220c15b82d96"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.648596 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" event={"ID":"d87e3f74-fd38-4b24-b489-cf054f0e8375","Type":"ContainerStarted","Data":"ae3aaccb64b0a9499cefcdef3d019f735315ce3ff3328d2119472a1b7a17f13f"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.648665 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" event={"ID":"d87e3f74-fd38-4b24-b489-cf054f0e8375","Type":"ContainerStarted","Data":"ee15fff7d3e7272ef9963fb16658c1d47d83b8740984e89819f3d7bee72856b2"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.656141 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb" 
event={"ID":"1566f9e4-cb9e-4ed5-b2c3-ec9e0ba1a294","Type":"ContainerStarted","Data":"af1bdd807eeaab01cf435ae51009a1f9fd19494ce1c360380e64e5b4f7a69022"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.694096 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.697093 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:43.197040468 +0000 UTC m=+226.284949768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.743044 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" event={"ID":"848993f1-7cd4-405c-8f05-74b0e0a79730","Type":"ContainerStarted","Data":"b1d1c54a8c45268d990f227b6795082b3833cc66d62e28b7ab50230629cf6162"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.746498 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5f26d" 
event={"ID":"5e96825d-646e-44d7-b980-c7efcd08d2f3","Type":"ContainerStarted","Data":"feebffe6e41bf23640688cc323af21a6c714fa1ad8938abfb93e21b1796a4222"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.751993 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cdq27" event={"ID":"8fb6723a-f90c-46d8-a294-d9f916179353","Type":"ContainerStarted","Data":"a9aac735cfd72f38657e56780f2154074363efe66bccfa6c94322f23718cfc69"} Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.766613 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.777938 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.799083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.799651 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:43.299631848 +0000 UTC m=+226.387541248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.837241 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.903242 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:42 crc kubenswrapper[4713]: E0314 05:30:42.905780 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:43.405750959 +0000 UTC m=+226.493660259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.910253 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.920263 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29"] Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.945760 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" podStartSLOduration=156.945734335 podStartE2EDuration="2m36.945734335s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:42.945222398 +0000 UTC m=+226.033131708" watchObservedRunningTime="2026-03-14 05:30:42.945734335 +0000 UTC m=+226.033643635" Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.957664 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf"] Mar 14 05:30:42 crc kubenswrapper[4713]: W0314 05:30:42.976537 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2608a13_9ee1_45ed_926b_329192ef4d34.slice/crio-c5ca1276eab42563dc2b2b62b12499ff1e2417b1c383720976dacf6daeacc6b3 WatchSource:0}: Error finding container 
c5ca1276eab42563dc2b2b62b12499ff1e2417b1c383720976dacf6daeacc6b3: Status 404 returned error can't find the container with id c5ca1276eab42563dc2b2b62b12499ff1e2417b1c383720976dacf6daeacc6b3 Mar 14 05:30:42 crc kubenswrapper[4713]: I0314 05:30:42.994843 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-whskd" podStartSLOduration=156.994814765 podStartE2EDuration="2m36.994814765s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:42.986521574 +0000 UTC m=+226.074430874" watchObservedRunningTime="2026-03-14 05:30:42.994814765 +0000 UTC m=+226.082724065" Mar 14 05:30:43 crc kubenswrapper[4713]: W0314 05:30:43.006540 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00783a2_42c7_45b5_b83d_136c314b0086.slice/crio-79c2b233c8e15b7375e804511f3d44570505b66025929306a1e8ced8dbe6f276 WatchSource:0}: Error finding container 79c2b233c8e15b7375e804511f3d44570505b66025929306a1e8ced8dbe6f276: Status 404 returned error can't find the container with id 79c2b233c8e15b7375e804511f3d44570505b66025929306a1e8ced8dbe6f276 Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.007394 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.009997 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-03-14 05:30:43.509981281 +0000 UTC m=+226.597890581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.076482 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nwnrf"] Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.077239 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p46s4"] Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.077428 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5df8p" podStartSLOduration=157.077402448 podStartE2EDuration="2m37.077402448s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.066583378 +0000 UTC m=+226.154492678" watchObservedRunningTime="2026-03-14 05:30:43.077402448 +0000 UTC m=+226.165311748" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.109463 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.110527 4713 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:43.610504397 +0000 UTC m=+226.698413697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.110791 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rp4kf" podStartSLOduration=157.110751154 podStartE2EDuration="2m37.110751154s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.103222928 +0000 UTC m=+226.191132238" watchObservedRunningTime="2026-03-14 05:30:43.110751154 +0000 UTC m=+226.198660454" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.171593 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" podStartSLOduration=156.171560693 podStartE2EDuration="2m36.171560693s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.169640293 +0000 UTC m=+226.257549583" watchObservedRunningTime="2026-03-14 05:30:43.171560693 +0000 UTC m=+226.259469993" Mar 14 05:30:43 crc 
kubenswrapper[4713]: I0314 05:30:43.195779 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vtptb" podStartSLOduration=157.195757873 podStartE2EDuration="2m37.195757873s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.194776712 +0000 UTC m=+226.282686012" watchObservedRunningTime="2026-03-14 05:30:43.195757873 +0000 UTC m=+226.283667173" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.213120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.213531 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:43.71351707 +0000 UTC m=+226.801426370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.240449 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" podStartSLOduration=157.240413745 podStartE2EDuration="2m37.240413745s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.23197878 +0000 UTC m=+226.319888080" watchObservedRunningTime="2026-03-14 05:30:43.240413745 +0000 UTC m=+226.328323045" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.270089 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" podStartSLOduration=157.270067835 podStartE2EDuration="2m37.270067835s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.268723974 +0000 UTC m=+226.356633274" watchObservedRunningTime="2026-03-14 05:30:43.270067835 +0000 UTC m=+226.357977135" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.294879 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.310082 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console-operator/console-operator-58897d9998-cdq27" podStartSLOduration=157.310061631 podStartE2EDuration="2m37.310061631s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.307355966 +0000 UTC m=+226.395265266" watchObservedRunningTime="2026-03-14 05:30:43.310061631 +0000 UTC m=+226.397970931" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.314958 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.315549 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:43.815511502 +0000 UTC m=+226.903420812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.394950 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8txgf" podStartSLOduration=156.394898744 podStartE2EDuration="2m36.394898744s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.351863413 +0000 UTC m=+226.439772733" watchObservedRunningTime="2026-03-14 05:30:43.394898744 +0000 UTC m=+226.482808044" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.418467 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.418863 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:43.918846306 +0000 UTC m=+227.006755596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.470506 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24bhl" podStartSLOduration=157.470473736 podStartE2EDuration="2m37.470473736s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.469689812 +0000 UTC m=+226.557599112" watchObservedRunningTime="2026-03-14 05:30:43.470473736 +0000 UTC m=+226.558383036" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.521232 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.521671 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.021654663 +0000 UTC m=+227.109563963 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.623287 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.623640 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.123626384 +0000 UTC m=+227.211535684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.724384 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.724613 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.224571353 +0000 UTC m=+227.312480653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.725011 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.725435 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.225417609 +0000 UTC m=+227.313326979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.823390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" event={"ID":"8c99344f-ed48-4193-9c8f-46c8f295ee0c","Type":"ContainerStarted","Data":"53cf48dcd0ebc13b8890521d46ba64a7c36baeb92b041eb489c30cee59a33736"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.823447 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" event={"ID":"8c99344f-ed48-4193-9c8f-46c8f295ee0c","Type":"ContainerStarted","Data":"529fd2f3655c0f5d41bed2198a22b9c469d5908d28d7907b47528b15dde6465c"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.826838 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.827069 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" event={"ID":"0dbd3bea-9644-4bc5-96c7-822b26810706","Type":"ContainerStarted","Data":"7ce0a082a4f4bbdc9d1a2cbedbddf234270e726f11fdddcb425c62a9d7b0cfd4"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.827555 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" event={"ID":"0dbd3bea-9644-4bc5-96c7-822b26810706","Type":"ContainerStarted","Data":"368e2410809e25e79e2883184fed157c18e64582618f1035023aa2b99de5855c"} Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.827245 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.327199224 +0000 UTC m=+227.415108524 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.827606 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.828127 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.328103803 +0000 UTC m=+227.416013123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.830954 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj" event={"ID":"b0c5d3d7-8d33-4eba-a572-3c702a05a6df","Type":"ContainerStarted","Data":"c56c060be66cbd49b233080d76bddcbae4513597218991041786596baa61c13f"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.830994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj" event={"ID":"b0c5d3d7-8d33-4eba-a572-3c702a05a6df","Type":"ContainerStarted","Data":"7afb7f022c55dcc33b4d78f5b367437d3d56959dae3376f9f9f8f90dbdfd02d9"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.833853 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" event={"ID":"3fb76932-a1a4-4af2-bb2e-3f9b0e872e51","Type":"ContainerStarted","Data":"5beb3507167ee0aaccb4efe7917b306b79500c9f877304b5605c91848cf73d66"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.836757 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" event={"ID":"3f6d9ce2-7015-482b-8249-c1e1dfb09be3","Type":"ContainerStarted","Data":"c5aa06b4f7295cc97d3ef7aff5d8107612494f395d0d44cc6a8c0d9205828eb7"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.842787 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" 
event={"ID":"b00783a2-42c7-45b5-b83d-136c314b0086","Type":"ContainerStarted","Data":"79c2b233c8e15b7375e804511f3d44570505b66025929306a1e8ced8dbe6f276"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.893172 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" event={"ID":"1af06ce0-0d0f-4113-b4e3-32d82664688b","Type":"ContainerStarted","Data":"a4292300ec7a47f47fd23c7c6bf5f64cb81def8d3f74ebb44e6c1820da4bdf8e"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.899870 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v9q67" event={"ID":"63d1129b-c978-4cb1-a73f-e09786113590","Type":"ContainerStarted","Data":"4c5dfc437a210771da531113c4d03962c25793f44b7b95007d49004d5615d198"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.906376 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" event={"ID":"d2608a13-9ee1-45ed-926b-329192ef4d34","Type":"ContainerStarted","Data":"c5ca1276eab42563dc2b2b62b12499ff1e2417b1c383720976dacf6daeacc6b3"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.908784 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-2qpzw" podStartSLOduration=156.908763655 podStartE2EDuration="2m36.908763655s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.85989129 +0000 UTC m=+226.947800590" watchObservedRunningTime="2026-03-14 05:30:43.908763655 +0000 UTC m=+226.996672955" Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.910292 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p" 
event={"ID":"04c75654-c34a-4a93-8be1-ab766574a3ed","Type":"ContainerStarted","Data":"cfc4ab36390c6bc44ef8f48acfd06fbe41d72a6e5a5678d861b0b8314ffecc99"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.926279 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" event={"ID":"1ce3a08b-0d84-46f7-aee4-c633105b323b","Type":"ContainerStarted","Data":"4baf6710e10c252f9bbfe2619709d404a84421997a559685ac9e8ae8f5805f8f"} Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.929389 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:43 crc kubenswrapper[4713]: E0314 05:30:43.933367 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.433331176 +0000 UTC m=+227.521240626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.938854 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-crb7p" podStartSLOduration=156.938818448 podStartE2EDuration="2m36.938818448s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.933706458 +0000 UTC m=+227.021615758" watchObservedRunningTime="2026-03-14 05:30:43.938818448 +0000 UTC m=+227.026727768"
Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.941779 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cm8rk" podStartSLOduration=157.94175942 podStartE2EDuration="2m37.94175942s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:43.907611388 +0000 UTC m=+226.995520688" watchObservedRunningTime="2026-03-14 05:30:43.94175942 +0000 UTC m=+227.029668720"
Mar 14 05:30:43 crc kubenswrapper[4713]: I0314 05:30:43.992444 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-s276w" event={"ID":"d626c9fa-84ff-40c0-ae90-c477a699591a","Type":"ContainerStarted","Data":"83e58aa515268ea64a276a1d2ac547a95829496d2e01dd2ac1585914943c4030"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.005632 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" event={"ID":"347baa98-b887-46c5-b2e9-bb6809206b42","Type":"ContainerStarted","Data":"3c43e1137c35cc8c3aa5e3a7081959d0af3b8965d6a330276fdf428e8fe1e4c8"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.034449 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.037171 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.537144064 +0000 UTC m=+227.625053574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.056581 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" event={"ID":"80e7a49a-8aa3-41d2-b1bf-74a0689fbee6","Type":"ContainerStarted","Data":"e19b2ea148127f06f99ed19a64991de335f2efd8734d4b728af3b3283d0ac19d"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.086514 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-s276w" podStartSLOduration=157.086482573 podStartE2EDuration="2m37.086482573s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.036345699 +0000 UTC m=+227.124254999" watchObservedRunningTime="2026-03-14 05:30:44.086482573 +0000 UTC m=+227.174391873"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.121982 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" event={"ID":"68986d18-3623-49a9-88c3-983c5acf2e09","Type":"ContainerStarted","Data":"ad3c17d7a82cdaa67adf46d7507864e28a96e5ca94efedf548053dd30187497f"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.137195 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.139149 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.639102105 +0000 UTC m=+227.727011555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.193604 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vfvfc" podStartSLOduration=157.193573405 podStartE2EDuration="2m37.193573405s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.091073147 +0000 UTC m=+227.178982457" watchObservedRunningTime="2026-03-14 05:30:44.193573405 +0000 UTC m=+227.281482705"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.251966 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.253419 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.753403103 +0000 UTC m=+227.841312403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.256119 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" event={"ID":"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec","Type":"ContainerStarted","Data":"baa3682466c9ae2c7012ba3eaf20f8ed949fab126a3415fca63fafa3b7102d68"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.256185 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.258887 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-s276w"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.266228 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.266378 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.282410 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body=
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.282517 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.304540 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" event={"ID":"5b056811-6e63-410a-b961-29b5fe78025d","Type":"ContainerStarted","Data":"ca09c29b16b724adf5c373b8a03158ebf03979fcc7716a4c1c51aaa77f1f9767"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.327610 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" event={"ID":"068aebba-22ff-46cd-856c-e85d409e0ae5","Type":"ContainerStarted","Data":"41f262140bd1b3146b013df15eab18ec3370e0bdcd4a75645ef2600b9cf271cd"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.327706 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" event={"ID":"068aebba-22ff-46cd-856c-e85d409e0ae5","Type":"ContainerStarted","Data":"145b8bd8e2da8692d69016dd984b559d4e2eb2a2b74b30257b1ff536d1429536"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.329447 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.331280 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vmsm7" podStartSLOduration=157.331247606 podStartE2EDuration="2m37.331247606s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.196856568 +0000 UTC m=+227.284765878" watchObservedRunningTime="2026-03-14 05:30:44.331247606 +0000 UTC m=+227.419156896"
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.357479 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.857448339 +0000 UTC m=+227.945357639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.357542 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.358000 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.358413 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.858399239 +0000 UTC m=+227.946308539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.362711 4713 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mw4tj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body=
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.362770 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.377572 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" podStartSLOduration=157.377549259 podStartE2EDuration="2m37.377549259s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.320743877 +0000 UTC m=+227.408653167" watchObservedRunningTime="2026-03-14 05:30:44.377549259 +0000 UTC m=+227.465458559"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.397557 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-54p9f" event={"ID":"a1866f7b-9063-44ab-86c3-5f0f38d9a2e5","Type":"ContainerStarted","Data":"e948a50ebeada5eeaa76fe1b64836271d25e35c8bf7e137e8b8118d91fc2a836"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.397630 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-54p9f" event={"ID":"a1866f7b-9063-44ab-86c3-5f0f38d9a2e5","Type":"ContainerStarted","Data":"2987d7c65817478089dce2467530a7bd153e6f87da0da4c0750751044065e348"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.429434 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p46s4" event={"ID":"96a65c75-d6b8-41db-8e74-263b186c7596","Type":"ContainerStarted","Data":"f29b9a1209f1b3420ba3b44d23e4b51243b7ccacf8654bab89746aac4a9b4619"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.452023 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podStartSLOduration=157.452001667 podStartE2EDuration="2m37.452001667s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.450053426 +0000 UTC m=+227.537962736" watchObservedRunningTime="2026-03-14 05:30:44.452001667 +0000 UTC m=+227.539910967"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.452898 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" podStartSLOduration=157.452892516 podStartE2EDuration="2m37.452892516s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.380122461 +0000 UTC m=+227.468031771" watchObservedRunningTime="2026-03-14 05:30:44.452892516 +0000 UTC m=+227.540801806"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.455062 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" event={"ID":"848993f1-7cd4-405c-8f05-74b0e0a79730","Type":"ContainerStarted","Data":"207eb3ea2c26dec5c6d9f7c0e913cc110dc1fec74402ef3904b23f87d7dff030"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.461534 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.462637 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:44.962581569 +0000 UTC m=+228.050491009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.489404 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" event={"ID":"5eb31828-9e51-4c7d-bc14-8787b4eca812","Type":"ContainerStarted","Data":"e93e4f33147aba630b6620f0d9cb087b1bc5abd32de677b53e682f0e3ce14ab4"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.516654 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp" event={"ID":"ebe4aaf5-99ea-4f49-a6a9-88ce3e1c6037","Type":"ContainerStarted","Data":"3125bc6ac629132bcf9c69a15cbcab7ad1939ba542f878d74ae6b454b29d8fad"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.530666 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" event={"ID":"71586d21-5d3e-4ea9-840e-989af77915e8","Type":"ContainerStarted","Data":"740d0a2a92ffd21d11db37589719b71875653fd34067afe643858a24702a2cf8"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.548830 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-54p9f" podStartSLOduration=6.548804446 podStartE2EDuration="6.548804446s" podCreationTimestamp="2026-03-14 05:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.502984778 +0000 UTC m=+227.590894078" watchObservedRunningTime="2026-03-14 05:30:44.548804446 +0000 UTC m=+227.636713746"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.549991 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" podStartSLOduration=157.549981963 podStartE2EDuration="2m37.549981963s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.549612091 +0000 UTC m=+227.637521391" watchObservedRunningTime="2026-03-14 05:30:44.549981963 +0000 UTC m=+227.637891263"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.559683 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" event={"ID":"ebbfd4b5-734f-4e37-89af-c0f4f0904d94","Type":"ContainerStarted","Data":"b9eea7624e16db9136826380d3fa69534d47191c633b4ae653a8056922c3c6e4"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.565057 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.566190 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.066173411 +0000 UTC m=+228.154082711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.569685 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" event={"ID":"da4bb664-a24b-4644-9b7a-a0c6eda2c66f","Type":"ContainerStarted","Data":"450c6e52ec6a5edfbb0bb0de38472b0a24f097ed56db00c6e347debcd90953b9"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.574147 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.583591 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5f26d" event={"ID":"5e96825d-646e-44d7-b980-c7efcd08d2f3","Type":"ContainerStarted","Data":"6db5427dfe25ea40991c26d86b7c0bbc56bde1d4042e0cd8c743bab29f0ab5f3"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.597403 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" podStartSLOduration=158.597381761 podStartE2EDuration="2m38.597381761s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.588159711 +0000 UTC m=+227.676069011" watchObservedRunningTime="2026-03-14 05:30:44.597381761 +0000 UTC m=+227.685291051"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.608574 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-24csf" event={"ID":"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30","Type":"ContainerStarted","Data":"a488b76556d1b133bf7fb41bc3f703ae2dd262663d8e38c142edfa16016ce117"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.610227 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" event={"ID":"9d6db3a7-58c6-44ad-8bed-daf1086729ad","Type":"ContainerStarted","Data":"47da296d39a29c878589b14183e1753011e0a569e0a03fa8a6b7a320ca2c0943"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.610249 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" event={"ID":"9d6db3a7-58c6-44ad-8bed-daf1086729ad","Type":"ContainerStarted","Data":"e5b2df3ab4c5d3d642728aea7ee4fc82479e3aaaaef034cf6e3da84ed6a865d9"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.639587 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" event={"ID":"ed4a1500-6481-4d26-a107-f76299623688","Type":"ContainerStarted","Data":"1531f4c8cfaba562861ee47ac560e9b6043488dc9aed0ac983051e6bea585645"}
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.639763 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.642004 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-84xqp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.642056 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" podUID="6491e577-c2d1-4c4b-b1d0-e82b34eec943" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.642158 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-whskd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.642174 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-whskd" podUID="fc9fcc69-c663-4474-b449-eee4c468cd4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.648465 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qkdqn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body=
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.649097 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" podUID="ed4a1500-6481-4d26-a107-f76299623688" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.650659 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" podStartSLOduration=157.650639473 podStartE2EDuration="2m37.650639473s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.648874088 +0000 UTC m=+227.736783388" watchObservedRunningTime="2026-03-14 05:30:44.650639473 +0000 UTC m=+227.738548763"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.658365 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.673118 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.674531 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.174483041 +0000 UTC m=+228.262392341 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.685527 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.685817 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.185802876 +0000 UTC m=+228.273712166 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.705017 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-556hp" podStartSLOduration=158.704976169 podStartE2EDuration="2m38.704976169s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.703628306 +0000 UTC m=+227.791537606" watchObservedRunningTime="2026-03-14 05:30:44.704976169 +0000 UTC m=+227.792885479"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.790939 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.797710 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.297675968 +0000 UTC m=+228.385585268 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.880360 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9lrmn" podStartSLOduration=157.880334273 podStartE2EDuration="2m37.880334273s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.875357397 +0000 UTC m=+227.963266697" watchObservedRunningTime="2026-03-14 05:30:44.880334273 +0000 UTC m=+227.968243573"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.908350 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.916103 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podStartSLOduration=158.916077745 podStartE2EDuration="2m38.916077745s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.914908859 +0000 UTC m=+228.002818159" watchObservedRunningTime="2026-03-14 05:30:44.916077745 +0000 UTC m=+228.003987035"
Mar 14 05:30:44 crc kubenswrapper[4713]: E0314 05:30:44.930482 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.430458576 +0000 UTC m=+228.518367876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:44 crc kubenswrapper[4713]: I0314 05:30:44.964391 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5f26d" podStartSLOduration=6.964359831 podStartE2EDuration="6.964359831s" podCreationTimestamp="2026-03-14 05:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.946978755 +0000 UTC m=+228.034888055" watchObservedRunningTime="2026-03-14 05:30:44.964359831 +0000 UTC m=+228.052269131"
Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.016688 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.016918 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.516872069 +0000 UTC m=+228.604781389 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.017387 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.019006 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.518991135 +0000 UTC m=+228.606900435 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.124833 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.125301 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.625283031 +0000 UTC m=+228.713192331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.223409 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" podStartSLOduration=158.223388711 podStartE2EDuration="2m38.223388711s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:44.96783088 +0000 UTC m=+228.055740180" watchObservedRunningTime="2026-03-14 05:30:45.223388711 +0000 UTC m=+228.311298011" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.223988 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84xqp"] Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.226251 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.226675 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 05:30:45.726658444 +0000 UTC m=+228.814567744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.267606 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:45 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 14 05:30:45 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:45 crc kubenswrapper[4713]: healthz check failed Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.267704 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.291031 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"] Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.324554 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.324621 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 
05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.327035 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.327261 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.827226081 +0000 UTC m=+228.915135381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.327469 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.328063 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 05:30:45.828019876 +0000 UTC m=+228.915929386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.428690 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.428890 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.928853491 +0000 UTC m=+229.016762791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.429254 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.429665 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:45.929657057 +0000 UTC m=+229.017566357 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.530121 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.530311 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.030272105 +0000 UTC m=+229.118181405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.530656 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.531045 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.031027798 +0000 UTC m=+229.118937098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.632515 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.632977 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.132940298 +0000 UTC m=+229.220849598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.661944 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-24csf" event={"ID":"d6d1d72a-09c8-47bc-ad1a-ad11843c9a30","Type":"ContainerStarted","Data":"00f18de01fab18841093034edef0cbec366c234b9a1520496876040486dad9a4"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.670790 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" event={"ID":"d2608a13-9ee1-45ed-926b-329192ef4d34","Type":"ContainerStarted","Data":"58d1e11cdb6b911204b129bf9878c939b127cfb6725ecab7d2093c69416fa9b3"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.698074 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p46s4" event={"ID":"96a65c75-d6b8-41db-8e74-263b186c7596","Type":"ContainerStarted","Data":"40ff019f441cb64f98643031b21c754fe3de43e34f5ad221690aa55a565ef0c4"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.698127 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p46s4" event={"ID":"96a65c75-d6b8-41db-8e74-263b186c7596","Type":"ContainerStarted","Data":"ce7acd976a406a6246c615420576a8840307a1f08d01bc97f20141207fc8de78"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.698354 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.719180 4713 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" event={"ID":"0dbd3bea-9644-4bc5-96c7-822b26810706","Type":"ContainerStarted","Data":"13e6ab233ef0c13bdf1bd01ab727f6cea6a86571550f80519e78933b36a80c23"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.719309 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.727612 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" event={"ID":"b00783a2-42c7-45b5-b83d-136c314b0086","Type":"ContainerStarted","Data":"682685c348eba1a5440a7b263039edfb02505d518bcfe044f565cb9553b13b1c"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.727825 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.730770 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body= Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.730836 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.731195 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" 
event={"ID":"1ce3a08b-0d84-46f7-aee4-c633105b323b","Type":"ContainerStarted","Data":"ca773624902b38ef72fc6cd0e62b5d5418af3918282731567806c9b7e06155c9"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.731253 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" event={"ID":"1ce3a08b-0d84-46f7-aee4-c633105b323b","Type":"ContainerStarted","Data":"4ecb5075d791b254b085af04d7ea6e5df2b0f6d5a70532d95ab73356e4a0001f"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.734112 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.735570 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.235544868 +0000 UTC m=+229.323454368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.758623 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" event={"ID":"848993f1-7cd4-405c-8f05-74b0e0a79730","Type":"ContainerStarted","Data":"b670ea4b76de154ecb9d04cc6b67fbb67de5b88edb9f3faf4faa630729bd30ac"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.777559 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hvgrc" event={"ID":"5b056811-6e63-410a-b961-29b5fe78025d","Type":"ContainerStarted","Data":"57e65ce1956f396d6ad811380021216730bee21b4fd61a8e022727010f319563"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.794386 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-24csf" podStartSLOduration=159.794365704 podStartE2EDuration="2m39.794365704s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:45.76713053 +0000 UTC m=+228.855039820" watchObservedRunningTime="2026-03-14 05:30:45.794365704 +0000 UTC m=+228.882275004" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.806964 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5vl8b" 
event={"ID":"71586d21-5d3e-4ea9-840e-989af77915e8","Type":"ContainerStarted","Data":"4b454ad33b8ade540575c31a9f0ac7fa849de6e128ab270522b7222129e17015"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.820841 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj" event={"ID":"b0c5d3d7-8d33-4eba-a572-3c702a05a6df","Type":"ContainerStarted","Data":"538226aa24b5c806db94a8d5f6d4f11ce78ab0d3d8f17c60ae1bc08d4a4f9971"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.835756 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.836092 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.336056203 +0000 UTC m=+229.423965503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.836635 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.838232 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.338221522 +0000 UTC m=+229.426130822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.882545 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" event={"ID":"ebbfd4b5-734f-4e37-89af-c0f4f0904d94","Type":"ContainerStarted","Data":"070766bc2df98f5167118c8971311bf705aaddc5c100f4d5c1551e03ebf27ee2"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.939227 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:45 crc kubenswrapper[4713]: E0314 05:30:45.940959 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.440924025 +0000 UTC m=+229.528833325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.947071 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33806: no serving certificate available for the kubelet" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.951506 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" podStartSLOduration=45.951484977 podStartE2EDuration="45.951484977s" podCreationTimestamp="2026-03-14 05:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:45.824522542 +0000 UTC m=+228.912431842" watchObservedRunningTime="2026-03-14 05:30:45.951484977 +0000 UTC m=+229.039394277" Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.977476 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" event={"ID":"8c99344f-ed48-4193-9c8f-46c8f295ee0c","Type":"ContainerStarted","Data":"439b8267c5c814282a96bd84f839cfb729823850ffb7a0da7becc10e41259c26"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.989049 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-swm74" event={"ID":"5eb31828-9e51-4c7d-bc14-8787b4eca812","Type":"ContainerStarted","Data":"d100e587508f78d686148086319bd5fb59f0ee9e708db2679a7d881c6a8a6708"} Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.989318 
4713 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mw4tj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 14 05:30:45 crc kubenswrapper[4713]: I0314 05:30:45.989359 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.012622 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.012683 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.030808 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.034688 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.034876 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.041986 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.043963 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.543946829 +0000 UTC m=+229.631856129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.047720 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.058601 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ph5z9" podStartSLOduration=160.058576379 podStartE2EDuration="2m40.058576379s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-14 05:30:45.95349659 +0000 UTC m=+229.041405890" watchObservedRunningTime="2026-03-14 05:30:46.058576379 +0000 UTC m=+229.146485679" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.059517 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podStartSLOduration=159.059510218 podStartE2EDuration="2m39.059510218s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:46.059384094 +0000 UTC m=+229.147293394" watchObservedRunningTime="2026-03-14 05:30:46.059510218 +0000 UTC m=+229.147419538" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.146168 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.148353 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.648330126 +0000 UTC m=+229.736239486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.191371 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" podStartSLOduration=159.191341827 podStartE2EDuration="2m39.191341827s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:46.128634188 +0000 UTC m=+229.216543498" watchObservedRunningTime="2026-03-14 05:30:46.191341827 +0000 UTC m=+229.279251127" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.192329 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p46s4" podStartSLOduration=8.192323557 podStartE2EDuration="8.192323557s" podCreationTimestamp="2026-03-14 05:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:46.186595967 +0000 UTC m=+229.274505267" watchObservedRunningTime="2026-03-14 05:30:46.192323557 +0000 UTC m=+229.280232857" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.218484 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33812: no serving certificate available for the kubelet" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.255047 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.255985 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.755959765 +0000 UTC m=+229.843869065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.267903 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:46 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 14 05:30:46 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:46 crc kubenswrapper[4713]: healthz check failed Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.268248 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.321787 4713 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bvnfp" podStartSLOduration=159.3217613 podStartE2EDuration="2m39.3217613s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:46.256419479 +0000 UTC m=+229.344328779" watchObservedRunningTime="2026-03-14 05:30:46.3217613 +0000 UTC m=+229.409670600" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.357904 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.358289 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.858259526 +0000 UTC m=+229.946168826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.377653 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33826: no serving certificate available for the kubelet" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.439470 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-l4d7n" podStartSLOduration=159.439450704 podStartE2EDuration="2m39.439450704s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:46.429215973 +0000 UTC m=+229.517125273" watchObservedRunningTime="2026-03-14 05:30:46.439450704 +0000 UTC m=+229.527360014" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.441360 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qddwj" podStartSLOduration=159.441354044 podStartE2EDuration="2m39.441354044s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:46.362810618 +0000 UTC m=+229.450719918" watchObservedRunningTime="2026-03-14 05:30:46.441354044 +0000 UTC m=+229.529263344" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.463066 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.463458 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:46.963445557 +0000 UTC m=+230.051354857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.563804 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.564591 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.064569872 +0000 UTC m=+230.152479172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.564651 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33828: no serving certificate available for the kubelet" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.650529 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-592mg" podStartSLOduration=159.65050201 podStartE2EDuration="2m39.65050201s" podCreationTimestamp="2026-03-14 05:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:46.644996127 +0000 UTC m=+229.732905417" watchObservedRunningTime="2026-03-14 05:30:46.65050201 +0000 UTC m=+229.738411300" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.667172 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.667290 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" 
(UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.667322 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.667386 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.667413 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.669123 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.169100553 +0000 UTC m=+230.257009853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.670078 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.674133 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.685859 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.688943 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.723020 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33842: no serving certificate available for the kubelet" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.769594 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.769945 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.269928938 +0000 UTC m=+230.357838238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.876096 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.876502 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.376485443 +0000 UTC m=+230.464394743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.882059 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.892409 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.899548 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.977364 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.977703 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.477661559 +0000 UTC m=+230.565570859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:46 crc kubenswrapper[4713]: I0314 05:30:46.977773 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:46 crc kubenswrapper[4713]: E0314 05:30:46.978142 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.478128564 +0000 UTC m=+230.566037864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.014710 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33854: no serving certificate available for the kubelet" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.079343 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.079560 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.579519836 +0000 UTC m=+230.667429146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.080099 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.080643 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.580621421 +0000 UTC m=+230.668530721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.100549 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" event={"ID":"3f6d9ce2-7015-482b-8249-c1e1dfb09be3","Type":"ContainerStarted","Data":"3d28d11fa94b8b872ddb79b02d0a486dc5aca2b372cb790e17c31dc28dcc4c6f"} Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.101962 4713 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mw4tj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.102037 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.102618 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" podUID="6491e577-c2d1-4c4b-b1d0-e82b34eec943" containerName="controller-manager" containerID="cri-o://664a55ffcd38c8805437725d6089ec3d4bf553b5974497a43a2e6295e9c819e4" gracePeriod=30 Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.103057 4713 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" podUID="352eafb9-871b-4540-9083-d5a6c0340453" containerName="route-controller-manager" containerID="cri-o://6fb636c45382273dc52e75f9b6dc6d12f236760c3509bdb0c8611a2e1f868f15" gracePeriod=30 Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.113516 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlkm6" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.181037 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.182454 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.682421707 +0000 UTC m=+230.770331157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.285792 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.286664 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.786643768 +0000 UTC m=+230.874553068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.313713 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:47 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 14 05:30:47 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:47 crc kubenswrapper[4713]: healthz check failed Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.313818 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.314272 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.314308 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" 
probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.333198 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jncw4"] Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.334781 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.340535 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.340588 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.353026 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.377748 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jncw4"] Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.386574 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33856: no serving certificate available for the kubelet" Mar 14 05:30:47 crc 
kubenswrapper[4713]: I0314 05:30:47.387847 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.388197 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.888174935 +0000 UTC m=+230.976084235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.492856 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.492944 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-catalog-content\") pod \"community-operators-jncw4\" (UID: 
\"c49b0182-cb22-4f55-b7a8-893646fa21fe\") " pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.492962 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-utilities\") pod \"community-operators-jncw4\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") " pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.493002 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fpgn\" (UniqueName: \"kubernetes.io/projected/c49b0182-cb22-4f55-b7a8-893646fa21fe-kube-api-access-2fpgn\") pod \"community-operators-jncw4\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") " pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.493304 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:47.993290726 +0000 UTC m=+231.081200026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.521943 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zbxl4"] Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.539120 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbxl4"] Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.539274 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.550436 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.595497 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.595806 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-catalog-content\") pod \"community-operators-jncw4\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") " pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc 
kubenswrapper[4713]: I0314 05:30:47.595829 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-utilities\") pod \"community-operators-jncw4\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") " pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.595854 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fpgn\" (UniqueName: \"kubernetes.io/projected/c49b0182-cb22-4f55-b7a8-893646fa21fe-kube-api-access-2fpgn\") pod \"community-operators-jncw4\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") " pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.596351 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:48.09633285 +0000 UTC m=+231.184242150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.596744 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-catalog-content\") pod \"community-operators-jncw4\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") " pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.596958 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-utilities\") pod \"community-operators-jncw4\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") " pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.629701 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fpgn\" (UniqueName: \"kubernetes.io/projected/c49b0182-cb22-4f55-b7a8-893646fa21fe-kube-api-access-2fpgn\") pod \"community-operators-jncw4\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") " pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.691282 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2fddx"] Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.693036 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.697629 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-utilities\") pod \"certified-operators-zbxl4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") " pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.697692 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p4j2\" (UniqueName: \"kubernetes.io/projected/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-kube-api-access-7p4j2\") pod \"certified-operators-zbxl4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") " pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.697736 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.697755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-catalog-content\") pod \"certified-operators-zbxl4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") " pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.698676 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:48.198661832 +0000 UTC m=+231.286571122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.702504 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jncw4" Mar 14 05:30:47 crc kubenswrapper[4713]: W0314 05:30:47.763288 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d14760c574a2d56fee3e8160ae30b879b6172f53d41f79d6c8e92a3deb2fb8c3 WatchSource:0}: Error finding container d14760c574a2d56fee3e8160ae30b879b6172f53d41f79d6c8e92a3deb2fb8c3: Status 404 returned error can't find the container with id d14760c574a2d56fee3e8160ae30b879b6172f53d41f79d6c8e92a3deb2fb8c3 Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.768842 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fddx"] Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.789886 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33872: no serving certificate available for the kubelet" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.798840 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.799039 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-utilities\") pod \"certified-operators-zbxl4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") " pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.799094 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4cng\" (UniqueName: \"kubernetes.io/projected/e5416d4f-43bb-4ca5-a433-1e408bc69d26-kube-api-access-q4cng\") pod \"community-operators-2fddx\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") " pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.799128 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p4j2\" (UniqueName: \"kubernetes.io/projected/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-kube-api-access-7p4j2\") pod \"certified-operators-zbxl4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") " pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.799176 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-catalog-content\") pod \"certified-operators-zbxl4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") " pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.799195 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-utilities\") pod 
\"community-operators-2fddx\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") " pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.799231 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-catalog-content\") pod \"community-operators-2fddx\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") " pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.799339 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:48.299317561 +0000 UTC m=+231.387226861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.804505 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-utilities\") pod \"certified-operators-zbxl4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") " pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.804651 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-catalog-content\") pod \"certified-operators-zbxl4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") " pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.852523 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p4j2\" (UniqueName: \"kubernetes.io/projected/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-kube-api-access-7p4j2\") pod \"certified-operators-zbxl4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") " pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.869766 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5nc5f"] Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.874562 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nc5f" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.900316 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4cng\" (UniqueName: \"kubernetes.io/projected/e5416d4f-43bb-4ca5-a433-1e408bc69d26-kube-api-access-q4cng\") pod \"community-operators-2fddx\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") " pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.902466 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.902595 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-utilities\") pod \"community-operators-2fddx\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") " pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.902658 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-catalog-content\") pod \"community-operators-2fddx\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") " pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.903396 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-catalog-content\") pod \"community-operators-2fddx\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") " pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:47 crc kubenswrapper[4713]: E0314 05:30:47.903428 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:48.403401689 +0000 UTC m=+231.491310989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.903641 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-utilities\") pod \"community-operators-2fddx\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") " pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.909023 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5nc5f"] Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.927759 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbxl4" Mar 14 05:30:47 crc kubenswrapper[4713]: I0314 05:30:47.936032 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4cng\" (UniqueName: \"kubernetes.io/projected/e5416d4f-43bb-4ca5-a433-1e408bc69d26-kube-api-access-q4cng\") pod \"community-operators-2fddx\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") " pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.013457 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.013767 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-utilities\") pod \"certified-operators-5nc5f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") " pod="openshift-marketplace/certified-operators-5nc5f" Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.013798 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-catalog-content\") pod \"certified-operators-5nc5f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") " pod="openshift-marketplace/certified-operators-5nc5f" Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.013869 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz7r9\" (UniqueName: \"kubernetes.io/projected/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-kube-api-access-wz7r9\") pod 
\"certified-operators-5nc5f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") " pod="openshift-marketplace/certified-operators-5nc5f" Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.014101 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:48.514074523 +0000 UTC m=+231.601983823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.046668 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.047747 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2fddx" Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.122791 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.122903 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-utilities\") pod \"certified-operators-5nc5f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") " pod="openshift-marketplace/certified-operators-5nc5f" Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.122923 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-catalog-content\") pod \"certified-operators-5nc5f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") " pod="openshift-marketplace/certified-operators-5nc5f" Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.122965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz7r9\" (UniqueName: \"kubernetes.io/projected/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-kube-api-access-wz7r9\") pod \"certified-operators-5nc5f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") " pod="openshift-marketplace/certified-operators-5nc5f" Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.123365 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 05:30:48.623342902 +0000 UTC m=+231.711252202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.123584 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-utilities\") pod \"certified-operators-5nc5f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") " pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.123917 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-catalog-content\") pod \"certified-operators-5nc5f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") " pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.146921 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d14760c574a2d56fee3e8160ae30b879b6172f53d41f79d6c8e92a3deb2fb8c3"}
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.167091 4713 generic.go:334] "Generic (PLEG): container finished" podID="352eafb9-871b-4540-9083-d5a6c0340453" containerID="6fb636c45382273dc52e75f9b6dc6d12f236760c3509bdb0c8611a2e1f868f15" exitCode=0
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.167543 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" event={"ID":"352eafb9-871b-4540-9083-d5a6c0340453","Type":"ContainerDied","Data":"6fb636c45382273dc52e75f9b6dc6d12f236760c3509bdb0c8611a2e1f868f15"}
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.171322 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz7r9\" (UniqueName: \"kubernetes.io/projected/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-kube-api-access-wz7r9\") pod \"certified-operators-5nc5f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") " pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.194035 4713 generic.go:334] "Generic (PLEG): container finished" podID="6491e577-c2d1-4c4b-b1d0-e82b34eec943" containerID="664a55ffcd38c8805437725d6089ec3d4bf553b5974497a43a2e6295e9c819e4" exitCode=0
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.194104 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" event={"ID":"6491e577-c2d1-4c4b-b1d0-e82b34eec943","Type":"ContainerDied","Data":"664a55ffcd38c8805437725d6089ec3d4bf553b5974497a43a2e6295e9c819e4"}
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.197756 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" event={"ID":"3f6d9ce2-7015-482b-8249-c1e1dfb09be3","Type":"ContainerStarted","Data":"147943d1cbdf62b43b444591e11d633e5e896ef576cf449c24791bc39b8b6a11"}
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.224760 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.225321 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:48.725301923 +0000 UTC m=+231.813211213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.269884 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.307487 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 14 05:30:48 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld
Mar 14 05:30:48 crc kubenswrapper[4713]: [+]process-running ok
Mar 14 05:30:48 crc kubenswrapper[4713]: healthz check failed
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.307563 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.327605 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.340540 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:48.84051606 +0000 UTC m=+231.928425360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.429979 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.430829 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:48.930806005 +0000 UTC m=+232.018715305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.500576 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33874: no serving certificate available for the kubelet"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.502976 4713 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.532689 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.533098 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.033084185 +0000 UTC m=+232.120993485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.567920 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jncw4"]
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.603265 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbxl4"]
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.605151 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.622021 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2608a13_9ee1_45ed_926b_329192ef4d34.slice/crio-conmon-58d1e11cdb6b911204b129bf9878c939b127cfb6725ecab7d2093c69416fa9b3.scope\": RecentStats: unable to find data in memory cache]"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.630362 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.633676 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.634259 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.13423374 +0000 UTC m=+232.222143040 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.641468 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"]
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.642234 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352eafb9-871b-4540-9083-d5a6c0340453" containerName="route-controller-manager"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.642253 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="352eafb9-871b-4540-9083-d5a6c0340453" containerName="route-controller-manager"
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.642285 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6491e577-c2d1-4c4b-b1d0-e82b34eec943" containerName="controller-manager"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.642294 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6491e577-c2d1-4c4b-b1d0-e82b34eec943" containerName="controller-manager"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.642435 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6491e577-c2d1-4c4b-b1d0-e82b34eec943" containerName="controller-manager"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.642449 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="352eafb9-871b-4540-9083-d5a6c0340453" containerName="route-controller-manager"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.642906 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.650285 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"]
Mar 14 05:30:48 crc kubenswrapper[4713]: W0314 05:30:48.682191 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaec2fda_bb55_4f4f_a487_8deeb4bf5da4.slice/crio-1fd34dcb4163de645994fa134aa019d806a4ccec8f942ea4f0e12295740e181d WatchSource:0}: Error finding container 1fd34dcb4163de645994fa134aa019d806a4ccec8f942ea4f0e12295740e181d: Status 404 returned error can't find the container with id 1fd34dcb4163de645994fa134aa019d806a4ccec8f942ea4f0e12295740e181d
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.686286 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fddx"]
Mar 14 05:30:48 crc kubenswrapper[4713]: W0314 05:30:48.714131 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5416d4f_43bb_4ca5_a433_1e408bc69d26.slice/crio-a747263f5a8e8cf9d309d5cb69a79f043e6c5cb33cf9efc6b1ef6e81aeabc238 WatchSource:0}: Error finding container a747263f5a8e8cf9d309d5cb69a79f043e6c5cb33cf9efc6b1ef6e81aeabc238: Status 404 returned error can't find the container with id a747263f5a8e8cf9d309d5cb69a79f043e6c5cb33cf9efc6b1ef6e81aeabc238
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.735660 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352eafb9-871b-4540-9083-d5a6c0340453-serving-cert\") pod \"352eafb9-871b-4540-9083-d5a6c0340453\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.735857 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-client-ca\") pod \"352eafb9-871b-4540-9083-d5a6c0340453\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.735943 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-config\") pod \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.735967 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-client-ca\") pod \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.736002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6491e577-c2d1-4c4b-b1d0-e82b34eec943-serving-cert\") pod \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.736035 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf7kv\" (UniqueName: \"kubernetes.io/projected/6491e577-c2d1-4c4b-b1d0-e82b34eec943-kube-api-access-qf7kv\") pod \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.736054 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-proxy-ca-bundles\") pod \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\" (UID: \"6491e577-c2d1-4c4b-b1d0-e82b34eec943\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.736127 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqc9z\" (UniqueName: \"kubernetes.io/projected/352eafb9-871b-4540-9083-d5a6c0340453-kube-api-access-gqc9z\") pod \"352eafb9-871b-4540-9083-d5a6c0340453\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.736165 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-config\") pod \"352eafb9-871b-4540-9083-d5a6c0340453\" (UID: \"352eafb9-871b-4540-9083-d5a6c0340453\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.736420 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-config\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.736463 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856889e7-9ce0-4695-9611-adbe441ac432-serving-cert\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.736546 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-client-ca\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.737844 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-client-ca" (OuterVolumeSpecName: "client-ca") pod "352eafb9-871b-4540-9083-d5a6c0340453" (UID: "352eafb9-871b-4540-9083-d5a6c0340453"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.737944 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-client-ca" (OuterVolumeSpecName: "client-ca") pod "6491e577-c2d1-4c4b-b1d0-e82b34eec943" (UID: "6491e577-c2d1-4c4b-b1d0-e82b34eec943"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.738017 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8dcr\" (UniqueName: \"kubernetes.io/projected/856889e7-9ce0-4695-9611-adbe441ac432-kube-api-access-v8dcr\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.738182 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.738085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-config" (OuterVolumeSpecName: "config") pod "6491e577-c2d1-4c4b-b1d0-e82b34eec943" (UID: "6491e577-c2d1-4c4b-b1d0-e82b34eec943"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.738519 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-client-ca\") on node \"crc\" DevicePath \"\""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.738542 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-config\") on node \"crc\" DevicePath \"\""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.738552 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-client-ca\") on node \"crc\" DevicePath \"\""
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.738927 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.238913416 +0000 UTC m=+232.326822716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.739604 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-config" (OuterVolumeSpecName: "config") pod "352eafb9-871b-4540-9083-d5a6c0340453" (UID: "352eafb9-871b-4540-9083-d5a6c0340453"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.740921 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6491e577-c2d1-4c4b-b1d0-e82b34eec943" (UID: "6491e577-c2d1-4c4b-b1d0-e82b34eec943"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.749511 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eafb9-871b-4540-9083-d5a6c0340453-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "352eafb9-871b-4540-9083-d5a6c0340453" (UID: "352eafb9-871b-4540-9083-d5a6c0340453"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.749597 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6491e577-c2d1-4c4b-b1d0-e82b34eec943-kube-api-access-qf7kv" (OuterVolumeSpecName: "kube-api-access-qf7kv") pod "6491e577-c2d1-4c4b-b1d0-e82b34eec943" (UID: "6491e577-c2d1-4c4b-b1d0-e82b34eec943"). InnerVolumeSpecName "kube-api-access-qf7kv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.751072 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352eafb9-871b-4540-9083-d5a6c0340453-kube-api-access-gqc9z" (OuterVolumeSpecName: "kube-api-access-gqc9z") pod "352eafb9-871b-4540-9083-d5a6c0340453" (UID: "352eafb9-871b-4540-9083-d5a6c0340453"). InnerVolumeSpecName "kube-api-access-gqc9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.752198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6491e577-c2d1-4c4b-b1d0-e82b34eec943-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6491e577-c2d1-4c4b-b1d0-e82b34eec943" (UID: "6491e577-c2d1-4c4b-b1d0-e82b34eec943"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841234 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841426 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-client-ca\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841479 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8dcr\" (UniqueName: \"kubernetes.io/projected/856889e7-9ce0-4695-9611-adbe441ac432-kube-api-access-v8dcr\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841537 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-config\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841557 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856889e7-9ce0-4695-9611-adbe441ac432-serving-cert\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841612 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqc9z\" (UniqueName: \"kubernetes.io/projected/352eafb9-871b-4540-9083-d5a6c0340453-kube-api-access-gqc9z\") on node \"crc\" DevicePath \"\""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841622 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352eafb9-871b-4540-9083-d5a6c0340453-config\") on node \"crc\" DevicePath \"\""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841632 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352eafb9-871b-4540-9083-d5a6c0340453-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841644 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6491e577-c2d1-4c4b-b1d0-e82b34eec943-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841655 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf7kv\" (UniqueName: \"kubernetes.io/projected/6491e577-c2d1-4c4b-b1d0-e82b34eec943-kube-api-access-qf7kv\") on node \"crc\" DevicePath \"\""
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.841667 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6491e577-c2d1-4c4b-b1d0-e82b34eec943-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.842841 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.342792476 +0000 UTC m=+232.430701776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.844097 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-client-ca\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.844195 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-config\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.849333 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856889e7-9ce0-4695-9611-adbe441ac432-serving-cert\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.862264 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8dcr\" (UniqueName: \"kubernetes.io/projected/856889e7-9ce0-4695-9611-adbe441ac432-kube-api-access-v8dcr\") pod \"route-controller-manager-5c5654d57f-wxmkv\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.943182 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5nc5f"]
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.945616 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:48 crc kubenswrapper[4713]: E0314 05:30:48.946155 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.446134951 +0000 UTC m=+232.534044251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:48 crc kubenswrapper[4713]: W0314 05:30:48.969617 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6d58d6d_6f1d_423a_ab96_cc34aa8c4d3f.slice/crio-adadb202a96a06eea6e0e9f50242452b86140ad92ac840a20fbadd4c4904cb1d WatchSource:0}: Error finding container adadb202a96a06eea6e0e9f50242452b86140ad92ac840a20fbadd4c4904cb1d: Status 404 returned error can't find the container with id adadb202a96a06eea6e0e9f50242452b86140ad92ac840a20fbadd4c4904cb1d
Mar 14 05:30:48 crc kubenswrapper[4713]: I0314 05:30:48.988226 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"
Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.048195 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:49 crc kubenswrapper[4713]: E0314 05:30:49.048413 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.54837805 +0000 UTC m=+232.636287360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.049122 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:30:49 crc kubenswrapper[4713]: E0314 05:30:49.049598 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.549587559 +0000 UTC m=+232.637496859 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.153097 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 14 05:30:49 crc kubenswrapper[4713]: E0314 05:30:49.153502 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.65347277 +0000 UTC m=+232.741382070 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.218440 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"27ed5accec0c1abc16755758d02da1436750689981e272acb23cfcf972128a0c"}
Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.218487 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0ea0e279d3cb83e36203c0d27c81f33dd48dcbee041603acac38cd20e0fdf106"}
Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.223719 4713 generic.go:334] "Generic (PLEG): container finished" podID="d2608a13-9ee1-45ed-926b-329192ef4d34" containerID="58d1e11cdb6b911204b129bf9878c939b127cfb6725ecab7d2093c69416fa9b3" exitCode=0
Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.223810 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" event={"ID":"d2608a13-9ee1-45ed-926b-329192ef4d34","Type":"ContainerDied","Data":"58d1e11cdb6b911204b129bf9878c939b127cfb6725ecab7d2093c69416fa9b3"}
Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.243355 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c"
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"228aea35d4eda681e204952569dea2f876ffae43734fc6c0d2c3a5f193c30b20"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.244583 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.250819 4713 generic.go:334] "Generic (PLEG): container finished" podID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerID="5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522" exitCode=0 Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.250959 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fddx" event={"ID":"e5416d4f-43bb-4ca5-a433-1e408bc69d26","Type":"ContainerDied","Data":"5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.251043 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fddx" event={"ID":"e5416d4f-43bb-4ca5-a433-1e408bc69d26","Type":"ContainerStarted","Data":"a747263f5a8e8cf9d309d5cb69a79f043e6c5cb33cf9efc6b1ef6e81aeabc238"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.274982 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:49 crc kubenswrapper[4713]: E0314 05:30:49.275719 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 05:30:49.775645854 +0000 UTC m=+232.863555154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.283434 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zt2"] Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.285698 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.291246 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.295178 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:49 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 14 05:30:49 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:49 crc kubenswrapper[4713]: healthz check failed Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.295742 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 
05:30:49.303978 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"] Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.311033 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" event={"ID":"6491e577-c2d1-4c4b-b1d0-e82b34eec943","Type":"ContainerDied","Data":"97bfc55a19ae2cd7f1b70ad1f6f12f8ef88ee5f3baf82af967dc20f526264bf8"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.311146 4713 scope.go:117] "RemoveContainer" containerID="664a55ffcd38c8805437725d6089ec3d4bf553b5974497a43a2e6295e9c819e4" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.315910 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-84xqp" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.317714 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.321846 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zt2"] Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.322729 4713 generic.go:334] "Generic (PLEG): container finished" podID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerID="6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2" exitCode=0 Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.322861 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncw4" event={"ID":"c49b0182-cb22-4f55-b7a8-893646fa21fe","Type":"ContainerDied","Data":"6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.322956 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-jncw4" event={"ID":"c49b0182-cb22-4f55-b7a8-893646fa21fe","Type":"ContainerStarted","Data":"d63598bb0b7f68069e44c4c93c9ab911a55ec34b2015efdf19ae9fe4cd1981e2"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.327174 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.328593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln" event={"ID":"352eafb9-871b-4540-9083-d5a6c0340453","Type":"ContainerDied","Data":"98588770d435e057ed4ca5ab2eb5a7d4c2c221f6f68de5db06eb417fc1579ca6"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.331752 4713 generic.go:334] "Generic (PLEG): container finished" podID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerID="15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69" exitCode=0 Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.331848 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbxl4" event={"ID":"caec2fda-bb55-4f4f-a487-8deeb4bf5da4","Type":"ContainerDied","Data":"15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.331882 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbxl4" event={"ID":"caec2fda-bb55-4f4f-a487-8deeb4bf5da4","Type":"ContainerStarted","Data":"1fd34dcb4163de645994fa134aa019d806a4ccec8f942ea4f0e12295740e181d"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.337241 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" 
event={"ID":"3f6d9ce2-7015-482b-8249-c1e1dfb09be3","Type":"ContainerStarted","Data":"188b573173b6548c643c8d5452ef3974880398a1ae0b013a44a57b55c9851be7"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.340248 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b0e3bad9e48216933c9c22f9adf8629ee79c4316eccd57db2451a107a460842b"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.340303 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8d59d127413a728d354ae9ebfe398eb36e16259546ff7221dd783a8de1e47c71"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.346101 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nc5f" event={"ID":"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f","Type":"ContainerStarted","Data":"adadb202a96a06eea6e0e9f50242452b86140ad92ac840a20fbadd4c4904cb1d"} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.377176 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:49 crc kubenswrapper[4713]: E0314 05:30:49.377443 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.877404089 +0000 UTC m=+232.965313389 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.377512 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-catalog-content\") pod \"redhat-marketplace-b9zt2\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") " pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.377602 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wmxt\" (UniqueName: \"kubernetes.io/projected/2bdf5393-1e5e-4965-a24c-b45a22c6053e-kube-api-access-8wmxt\") pod \"redhat-marketplace-b9zt2\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") " pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.377634 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.377665 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-utilities\") pod \"redhat-marketplace-b9zt2\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") " pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: E0314 05:30:49.379693 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.87967221 +0000 UTC m=+232.967581510 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rd5mn" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.399068 4713 scope.go:117] "RemoveContainer" containerID="6fb636c45382273dc52e75f9b6dc6d12f236760c3509bdb0c8611a2e1f868f15" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.471307 4713 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-14T05:30:48.503022931Z","Handler":null,"Name":""} Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.478440 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.478595 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8wmxt\" (UniqueName: \"kubernetes.io/projected/2bdf5393-1e5e-4965-a24c-b45a22c6053e-kube-api-access-8wmxt\") pod \"redhat-marketplace-b9zt2\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") " pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.478674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-utilities\") pod \"redhat-marketplace-b9zt2\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") " pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.478790 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-catalog-content\") pod \"redhat-marketplace-b9zt2\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") " pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.479641 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-catalog-content\") pod \"redhat-marketplace-b9zt2\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") " pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: E0314 05:30:49.480502 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 05:30:49.980483325 +0000 UTC m=+233.068392625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.480907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-utilities\") pod \"redhat-marketplace-b9zt2\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") " pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.485256 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" podStartSLOduration=11.485223953 podStartE2EDuration="11.485223953s" podCreationTimestamp="2026-03-14 05:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:49.472938757 +0000 UTC m=+232.560848057" watchObservedRunningTime="2026-03-14 05:30:49.485223953 +0000 UTC m=+232.573133253" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.497146 4713 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.497232 4713 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.507854 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8wmxt\" (UniqueName: \"kubernetes.io/projected/2bdf5393-1e5e-4965-a24c-b45a22c6053e-kube-api-access-8wmxt\") pod \"redhat-marketplace-b9zt2\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") " pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.545321 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84xqp"] Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.549248 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84xqp"] Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.558559 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"] Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.574247 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6491e577-c2d1-4c4b-b1d0-e82b34eec943" path="/var/lib/kubelet/pods/6491e577-c2d1-4c4b-b1d0-e82b34eec943/volumes" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.574728 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p2ln"] Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.580408 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.583715 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.583767 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.609989 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rd5mn\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") " pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.661595 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk5s"] Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.665753 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.674362 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk5s"] Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.720641 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9zt2" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.720955 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.748932 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.824530 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-utilities\") pod \"redhat-marketplace-rrk5s\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.824629 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-catalog-content\") pod \"redhat-marketplace-rrk5s\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.824697 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-889kj\" (UniqueName: 
\"kubernetes.io/projected/5979ce26-1ed6-49e0-ac41-24e70593ab24-kube-api-access-889kj\") pod \"redhat-marketplace-rrk5s\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.826027 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33882: no serving certificate available for the kubelet" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.870717 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.926517 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-889kj\" (UniqueName: \"kubernetes.io/projected/5979ce26-1ed6-49e0-ac41-24e70593ab24-kube-api-access-889kj\") pod \"redhat-marketplace-rrk5s\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.926619 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-utilities\") pod \"redhat-marketplace-rrk5s\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.926659 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-catalog-content\") pod \"redhat-marketplace-rrk5s\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.927919 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-catalog-content\") pod \"redhat-marketplace-rrk5s\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.928077 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-utilities\") pod \"redhat-marketplace-rrk5s\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:49 crc kubenswrapper[4713]: I0314 05:30:49.949500 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-889kj\" (UniqueName: \"kubernetes.io/projected/5979ce26-1ed6-49e0-ac41-24e70593ab24-kube-api-access-889kj\") pod \"redhat-marketplace-rrk5s\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.010342 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zt2"] Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.038439 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.039135 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.042667 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.043640 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.045755 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 05:30:50 crc kubenswrapper[4713]: W0314 05:30:50.057138 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bdf5393_1e5e_4965_a24c_b45a22c6053e.slice/crio-b0719048e481c066e911195f00ff58b9115f1e0ba83461bebb18393dab85d3ff WatchSource:0}: Error finding container b0719048e481c066e911195f00ff58b9115f1e0ba83461bebb18393dab85d3ff: Status 404 returned error can't find the container with id b0719048e481c066e911195f00ff58b9115f1e0ba83461bebb18393dab85d3ff Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.073485 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.129097 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.129150 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:50 crc 
kubenswrapper[4713]: I0314 05:30:50.231796 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.231908 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.232014 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.255616 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.268870 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:50 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 14 05:30:50 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:50 crc 
kubenswrapper[4713]: healthz check failed Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.269373 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.286723 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.286771 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.303311 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.337987 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rd5mn"] Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.360069 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nwnrf" event={"ID":"3f6d9ce2-7015-482b-8249-c1e1dfb09be3","Type":"ContainerStarted","Data":"df52f07c30f4846b8ba2a2e943d5193e829ff1a4a81419358edf10ebc6f45cfd"} Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.363499 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" event={"ID":"856889e7-9ce0-4695-9611-adbe441ac432","Type":"ContainerStarted","Data":"991e916c84398ffa84e223e63294b0ae59d231b2ed7f4999afe610124fffebaa"} Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.363555 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" 
event={"ID":"856889e7-9ce0-4695-9611-adbe441ac432","Type":"ContainerStarted","Data":"86929dde45026ba482dede597ceff0b7382fa059187f9181868ce27d4d1eeeaa"} Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.363579 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.375347 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerID="fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51" exitCode=0 Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.375491 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nc5f" event={"ID":"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f","Type":"ContainerDied","Data":"fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51"} Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.380081 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.382727 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.382906 4713 generic.go:334] "Generic (PLEG): container finished" podID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerID="585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5" exitCode=0 Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.383315 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zt2" event={"ID":"2bdf5393-1e5e-4965-a24c-b45a22c6053e","Type":"ContainerDied","Data":"585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5"} Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.383372 4713 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-b9zt2" event={"ID":"2bdf5393-1e5e-4965-a24c-b45a22c6053e","Type":"ContainerStarted","Data":"b0719048e481c066e911195f00ff58b9115f1e0ba83461bebb18393dab85d3ff"} Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.384378 4713 patch_prober.go:28] interesting pod/console-f9d7485db-rp4kf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.384427 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rp4kf" podUID="96bf650f-2c46-40aa-b26b-5d8a6df529fd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.385552 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.389631 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" podStartSLOduration=4.389067615 podStartE2EDuration="4.389067615s" podCreationTimestamp="2026-03-14 05:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:50.387412924 +0000 UTC m=+233.475322224" watchObservedRunningTime="2026-03-14 05:30:50.389067615 +0000 UTC m=+233.476976915" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.405517 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-24csf" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.425126 4713 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.426129 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.428603 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.429802 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.429847 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.471718 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 05:30:50 crc kubenswrapper[4713]: W0314 05:30:50.478465 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcedfba83_e56b_4913_9ac5_b5bbf3e71b7a.slice/crio-9d6eac711d2c530ab59214a55c323c95b4477e628d0e7bec4ca5bd4c42342df3 WatchSource:0}: Error finding container 9d6eac711d2c530ab59214a55c323c95b4477e628d0e7bec4ca5bd4c42342df3: Status 404 returned error can't find the container with id 9d6eac711d2c530ab59214a55c323c95b4477e628d0e7bec4ca5bd4c42342df3 Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.486600 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-whskd container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.486665 4713 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-whskd" podUID="fc9fcc69-c663-4474-b449-eee4c468cd4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.486947 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-whskd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.487005 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-whskd" podUID="fc9fcc69-c663-4474-b449-eee4c468cd4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.542182 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.542267 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.645518 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.657345 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.657581 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.673020 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dvtpw"] Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.674451 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.677018 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.684411 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.694911 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvtpw"] Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.788451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gvzz\" (UniqueName: \"kubernetes.io/projected/99ba127f-5518-4d4e-9581-10970dcb998c-kube-api-access-9gvzz\") pod \"redhat-operators-dvtpw\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") " pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.788650 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-utilities\") pod \"redhat-operators-dvtpw\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") " pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.788720 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-catalog-content\") pod \"redhat-operators-dvtpw\" (UID: 
\"99ba127f-5518-4d4e-9581-10970dcb998c\") " pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.788921 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.891561 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-utilities\") pod \"redhat-operators-dvtpw\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") " pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.892014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-catalog-content\") pod \"redhat-operators-dvtpw\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") " pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.892091 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gvzz\" (UniqueName: \"kubernetes.io/projected/99ba127f-5518-4d4e-9581-10970dcb998c-kube-api-access-9gvzz\") pod \"redhat-operators-dvtpw\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") " pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.892859 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-utilities\") pod \"redhat-operators-dvtpw\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") " pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.893019 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-catalog-content\") pod \"redhat-operators-dvtpw\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") " pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.904698 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk5s"] Mar 14 05:30:50 crc kubenswrapper[4713]: I0314 05:30:50.946793 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gvzz\" (UniqueName: \"kubernetes.io/projected/99ba127f-5518-4d4e-9581-10970dcb998c-kube-api-access-9gvzz\") pod \"redhat-operators-dvtpw\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") " pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.056689 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.057737 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2pwlb"] Mar 14 05:30:51 crc kubenswrapper[4713]: E0314 05:30:51.058077 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2608a13-9ee1-45ed-926b-329192ef4d34" containerName="collect-profiles" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.058164 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2608a13-9ee1-45ed-926b-329192ef4d34" containerName="collect-profiles" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.062857 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2608a13-9ee1-45ed-926b-329192ef4d34" containerName="collect-profiles" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.071419 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.071982 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pwlb"] Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.113120 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.202401 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gd52\" (UniqueName: \"kubernetes.io/projected/d2608a13-9ee1-45ed-926b-329192ef4d34-kube-api-access-4gd52\") pod \"d2608a13-9ee1-45ed-926b-329192ef4d34\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.202752 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2608a13-9ee1-45ed-926b-329192ef4d34-config-volume\") pod \"d2608a13-9ee1-45ed-926b-329192ef4d34\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.202800 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2608a13-9ee1-45ed-926b-329192ef4d34-secret-volume\") pod \"d2608a13-9ee1-45ed-926b-329192ef4d34\" (UID: \"d2608a13-9ee1-45ed-926b-329192ef4d34\") " Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.203029 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxh2k\" (UniqueName: \"kubernetes.io/projected/4cd7d2d9-c704-4019-9329-52c5fa68af0d-kube-api-access-mxh2k\") pod \"redhat-operators-2pwlb\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") " pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.203062 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-utilities\") pod \"redhat-operators-2pwlb\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") " pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.203088 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-catalog-content\") pod \"redhat-operators-2pwlb\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") " pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.217570 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2608a13-9ee1-45ed-926b-329192ef4d34-config-volume" (OuterVolumeSpecName: "config-volume") pod "d2608a13-9ee1-45ed-926b-329192ef4d34" (UID: "d2608a13-9ee1-45ed-926b-329192ef4d34"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.218470 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2608a13-9ee1-45ed-926b-329192ef4d34-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d2608a13-9ee1-45ed-926b-329192ef4d34" (UID: "d2608a13-9ee1-45ed-926b-329192ef4d34"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.218764 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2608a13-9ee1-45ed-926b-329192ef4d34-kube-api-access-4gd52" (OuterVolumeSpecName: "kube-api-access-4gd52") pod "d2608a13-9ee1-45ed-926b-329192ef4d34" (UID: "d2608a13-9ee1-45ed-926b-329192ef4d34"). 
InnerVolumeSpecName "kube-api-access-4gd52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.260734 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.266864 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:51 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 14 05:30:51 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:51 crc kubenswrapper[4713]: healthz check failed Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.266935 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.267655 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.299463 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.304982 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxh2k\" (UniqueName: \"kubernetes.io/projected/4cd7d2d9-c704-4019-9329-52c5fa68af0d-kube-api-access-mxh2k\") pod \"redhat-operators-2pwlb\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") " pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.305050 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-utilities\") pod \"redhat-operators-2pwlb\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") " pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.305097 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-catalog-content\") pod \"redhat-operators-2pwlb\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") " pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.305282 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2608a13-9ee1-45ed-926b-329192ef4d34-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.305304 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gd52\" (UniqueName: \"kubernetes.io/projected/d2608a13-9ee1-45ed-926b-329192ef4d34-kube-api-access-4gd52\") on node \"crc\" DevicePath \"\"" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.305314 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2608a13-9ee1-45ed-926b-329192ef4d34-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.311337 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-utilities\") pod \"redhat-operators-2pwlb\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") " pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.311606 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-catalog-content\") pod \"redhat-operators-2pwlb\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") " pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.333947 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxh2k\" (UniqueName: \"kubernetes.io/projected/4cd7d2d9-c704-4019-9329-52c5fa68af0d-kube-api-access-mxh2k\") pod \"redhat-operators-2pwlb\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") " pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.378115 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.400740 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"faa061f3-ae29-419e-8bd3-61f8dd7436a0","Type":"ContainerStarted","Data":"9d5ea3428053421dee34a43bb298484be3d6649e55277e1e99d31700d86b5fb5"} Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.409116 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4"] Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.409822 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.409974 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29" event={"ID":"d2608a13-9ee1-45ed-926b-329192ef4d34","Type":"ContainerDied","Data":"c5ca1276eab42563dc2b2b62b12499ff1e2417b1c383720976dacf6daeacc6b3"} Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.410007 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ca1276eab42563dc2b2b62b12499ff1e2417b1c383720976dacf6daeacc6b3" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.410100 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.412954 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.413384 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.413737 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.413826 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.413944 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.420087 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 05:30:51 crc 
kubenswrapper[4713]: I0314 05:30:51.426162 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.426614 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4"] Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.437094 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" event={"ID":"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a","Type":"ContainerStarted","Data":"16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5"} Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.437151 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" event={"ID":"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a","Type":"ContainerStarted","Data":"9d6eac711d2c530ab59214a55c323c95b4477e628d0e7bec4ca5bd4c42342df3"} Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.437725 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.438140 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.505813 4713 generic.go:334] "Generic (PLEG): container finished" podID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerID="483e297d99cb08fdf08d230d119cfea9eefcd44168a425e6bdfe5bcd534af96c" exitCode=0 Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.507108 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk5s" event={"ID":"5979ce26-1ed6-49e0-ac41-24e70593ab24","Type":"ContainerDied","Data":"483e297d99cb08fdf08d230d119cfea9eefcd44168a425e6bdfe5bcd534af96c"} Mar 14 05:30:51 
crc kubenswrapper[4713]: I0314 05:30:51.507164 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk5s" event={"ID":"5979ce26-1ed6-49e0-ac41-24e70593ab24","Type":"ContainerStarted","Data":"67eb6b22f881254a65791cdbe130e5f03e991cc431a433dbac3f63c70041279f"} Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.509831 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" podStartSLOduration=165.509816757 podStartE2EDuration="2m45.509816757s" podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:51.502182577 +0000 UTC m=+234.590091877" watchObservedRunningTime="2026-03-14 05:30:51.509816757 +0000 UTC m=+234.597726057" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.610721 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89bqp\" (UniqueName: \"kubernetes.io/projected/53a50175-ee14-4d0b-9037-b1cc213e10e6-kube-api-access-89bqp\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.611109 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-config\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.611407 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-proxy-ca-bundles\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.611489 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-client-ca\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.611581 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a50175-ee14-4d0b-9037-b1cc213e10e6-serving-cert\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.699425 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="352eafb9-871b-4540-9083-d5a6c0340453" path="/var/lib/kubelet/pods/352eafb9-871b-4540-9083-d5a6c0340453/volumes" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.700560 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.720036 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89bqp\" (UniqueName: \"kubernetes.io/projected/53a50175-ee14-4d0b-9037-b1cc213e10e6-kube-api-access-89bqp\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " 
pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.720088 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-config\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.720161 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-proxy-ca-bundles\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.720182 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-client-ca\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.720224 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a50175-ee14-4d0b-9037-b1cc213e10e6-serving-cert\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.723643 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-config\") pod \"controller-manager-7f9cf7cff-2p6n4\" 
(UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.725017 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-proxy-ca-bundles\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.725585 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-client-ca\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.736412 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a50175-ee14-4d0b-9037-b1cc213e10e6-serving-cert\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.752650 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89bqp\" (UniqueName: \"kubernetes.io/projected/53a50175-ee14-4d0b-9037-b1cc213e10e6-kube-api-access-89bqp\") pod \"controller-manager-7f9cf7cff-2p6n4\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:51 crc kubenswrapper[4713]: I0314 05:30:51.775649 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:52 crc kubenswrapper[4713]: I0314 05:30:52.073337 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pwlb"] Mar 14 05:30:52 crc kubenswrapper[4713]: I0314 05:30:52.170981 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvtpw"] Mar 14 05:30:52 crc kubenswrapper[4713]: I0314 05:30:52.262862 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:52 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 14 05:30:52 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:52 crc kubenswrapper[4713]: healthz check failed Mar 14 05:30:52 crc kubenswrapper[4713]: I0314 05:30:52.262917 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:52 crc kubenswrapper[4713]: I0314 05:30:52.486248 4713 ???:1] "http: TLS handshake error from 192.168.126.11:33894: no serving certificate available for the kubelet" Mar 14 05:30:52 crc kubenswrapper[4713]: I0314 05:30:52.550574 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4"] Mar 14 05:30:52 crc kubenswrapper[4713]: I0314 05:30:52.566065 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvtpw" event={"ID":"99ba127f-5518-4d4e-9581-10970dcb998c","Type":"ContainerStarted","Data":"b5bfbbab6467e14cbc88c8a8bbb5ca5f278e232b2034c14b871230e85131ff16"} Mar 14 05:30:52 crc kubenswrapper[4713]: 
I0314 05:30:52.572289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pwlb" event={"ID":"4cd7d2d9-c704-4019-9329-52c5fa68af0d","Type":"ContainerStarted","Data":"ed2d030d694f118d6492da5d47346980a628e2af44d16db19158aab86023a6c1"} Mar 14 05:30:52 crc kubenswrapper[4713]: I0314 05:30:52.575670 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"686d5c0e-b2c8-40a1-a6f3-bf913724d956","Type":"ContainerStarted","Data":"55bb41bbda976580d6780fcf42a0f06c99e42c9807530496eda567c2730ad26c"} Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.264558 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:53 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Mar 14 05:30:53 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:53 crc kubenswrapper[4713]: healthz check failed Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.264929 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.621745 4713 generic.go:334] "Generic (PLEG): container finished" podID="99ba127f-5518-4d4e-9581-10970dcb998c" containerID="94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452" exitCode=0 Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.621871 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvtpw" 
event={"ID":"99ba127f-5518-4d4e-9581-10970dcb998c","Type":"ContainerDied","Data":"94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452"} Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.707847 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" event={"ID":"53a50175-ee14-4d0b-9037-b1cc213e10e6","Type":"ContainerStarted","Data":"0a15700f6cdf7699b14af79364f463ad48c96018b542f02d0948059fba682390"} Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.707906 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" event={"ID":"53a50175-ee14-4d0b-9037-b1cc213e10e6","Type":"ContainerStarted","Data":"ce3a9ad0d060536840441cb2310b1f82f07feecb9319de941ec3a96af86a7b0d"} Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.709871 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.713678 4713 generic.go:334] "Generic (PLEG): container finished" podID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerID="f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246" exitCode=0 Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.713723 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pwlb" event={"ID":"4cd7d2d9-c704-4019-9329-52c5fa68af0d","Type":"ContainerDied","Data":"f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246"} Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.729818 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.740512 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" podStartSLOduration=7.740496439 podStartE2EDuration="7.740496439s" podCreationTimestamp="2026-03-14 05:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:30:53.738952511 +0000 UTC m=+236.826861821" watchObservedRunningTime="2026-03-14 05:30:53.740496439 +0000 UTC m=+236.828405739" Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.749719 4713 generic.go:334] "Generic (PLEG): container finished" podID="686d5c0e-b2c8-40a1-a6f3-bf913724d956" containerID="759e88894791992caa262323dd6c1e132c28a48e2f5c67659e562ae5952311d7" exitCode=0 Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.749839 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"686d5c0e-b2c8-40a1-a6f3-bf913724d956","Type":"ContainerDied","Data":"759e88894791992caa262323dd6c1e132c28a48e2f5c67659e562ae5952311d7"} Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.784860 4713 generic.go:334] "Generic (PLEG): container finished" podID="faa061f3-ae29-419e-8bd3-61f8dd7436a0" containerID="698ed94d0e713e089edc6ffe1bfb3a41fdc22584208037e37960907e3a2eaa14" exitCode=0 Mar 14 05:30:53 crc kubenswrapper[4713]: I0314 05:30:53.785234 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"faa061f3-ae29-419e-8bd3-61f8dd7436a0","Type":"ContainerDied","Data":"698ed94d0e713e089edc6ffe1bfb3a41fdc22584208037e37960907e3a2eaa14"} Mar 14 05:30:54 crc kubenswrapper[4713]: I0314 05:30:54.265270 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:54 crc kubenswrapper[4713]: [-]has-synced failed: 
reason withheld Mar 14 05:30:54 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:54 crc kubenswrapper[4713]: healthz check failed Mar 14 05:30:54 crc kubenswrapper[4713]: I0314 05:30:54.265420 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.274585 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 05:30:55 crc kubenswrapper[4713]: [+]has-synced ok Mar 14 05:30:55 crc kubenswrapper[4713]: [+]process-running ok Mar 14 05:30:55 crc kubenswrapper[4713]: healthz check failed Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.275189 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.432466 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.433077 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.546634 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kubelet-dir\") pod \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\" (UID: \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\") " Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.546716 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kube-api-access\") pod \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\" (UID: \"686d5c0e-b2c8-40a1-a6f3-bf913724d956\") " Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.546749 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kube-api-access\") pod \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\" (UID: \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\") " Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.546825 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kubelet-dir\") pod \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\" (UID: \"faa061f3-ae29-419e-8bd3-61f8dd7436a0\") " Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.547254 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "faa061f3-ae29-419e-8bd3-61f8dd7436a0" (UID: "faa061f3-ae29-419e-8bd3-61f8dd7436a0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.547311 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "686d5c0e-b2c8-40a1-a6f3-bf913724d956" (UID: "686d5c0e-b2c8-40a1-a6f3-bf913724d956"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.554093 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "686d5c0e-b2c8-40a1-a6f3-bf913724d956" (UID: "686d5c0e-b2c8-40a1-a6f3-bf913724d956"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.559897 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "faa061f3-ae29-419e-8bd3-61f8dd7436a0" (UID: "faa061f3-ae29-419e-8bd3-61f8dd7436a0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.649258 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.649311 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.649321 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faa061f3-ae29-419e-8bd3-61f8dd7436a0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.649330 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/686d5c0e-b2c8-40a1-a6f3-bf913724d956-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.869999 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"faa061f3-ae29-419e-8bd3-61f8dd7436a0","Type":"ContainerDied","Data":"9d5ea3428053421dee34a43bb298484be3d6649e55277e1e99d31700d86b5fb5"} Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.870049 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d5ea3428053421dee34a43bb298484be3d6649e55277e1e99d31700d86b5fb5" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.870113 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.907135 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.907367 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"686d5c0e-b2c8-40a1-a6f3-bf913724d956","Type":"ContainerDied","Data":"55bb41bbda976580d6780fcf42a0f06c99e42c9807530496eda567c2730ad26c"} Mar 14 05:30:55 crc kubenswrapper[4713]: I0314 05:30:55.907431 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55bb41bbda976580d6780fcf42a0f06c99e42c9807530496eda567c2730ad26c" Mar 14 05:30:56 crc kubenswrapper[4713]: I0314 05:30:56.262795 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:56 crc kubenswrapper[4713]: I0314 05:30:56.267982 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-s276w" Mar 14 05:30:56 crc kubenswrapper[4713]: I0314 05:30:56.378933 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p46s4" Mar 14 05:30:57 crc kubenswrapper[4713]: I0314 05:30:57.627193 4713 ???:1] "http: TLS handshake error from 192.168.126.11:58056: no serving certificate available for the kubelet" Mar 14 05:30:57 crc kubenswrapper[4713]: I0314 05:30:57.726022 4713 ???:1] "http: TLS handshake error from 192.168.126.11:58062: no serving certificate available for the kubelet" Mar 14 05:31:00 crc kubenswrapper[4713]: I0314 05:31:00.388999 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:31:00 crc kubenswrapper[4713]: I0314 05:31:00.393841 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:31:00 crc kubenswrapper[4713]: I0314 05:31:00.499216 4713 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-whskd" Mar 14 05:31:01 crc kubenswrapper[4713]: I0314 05:31:01.358199 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:31:01 crc kubenswrapper[4713]: I0314 05:31:01.361645 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 05:31:01 crc kubenswrapper[4713]: I0314 05:31:01.377360 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f2a1689-2973-4684-88d0-4ac7edb9b1d3-metrics-certs\") pod \"network-metrics-daemon-2t6mv\" (UID: \"8f2a1689-2973-4684-88d0-4ac7edb9b1d3\") " pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:31:01 crc kubenswrapper[4713]: I0314 05:31:01.485454 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 05:31:01 crc kubenswrapper[4713]: I0314 05:31:01.492937 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2t6mv" Mar 14 05:31:04 crc kubenswrapper[4713]: I0314 05:31:04.719101 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4"] Mar 14 05:31:04 crc kubenswrapper[4713]: I0314 05:31:04.720224 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" podUID="53a50175-ee14-4d0b-9037-b1cc213e10e6" containerName="controller-manager" containerID="cri-o://0a15700f6cdf7699b14af79364f463ad48c96018b542f02d0948059fba682390" gracePeriod=30 Mar 14 05:31:04 crc kubenswrapper[4713]: I0314 05:31:04.743037 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"] Mar 14 05:31:04 crc kubenswrapper[4713]: I0314 05:31:04.743340 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" podUID="856889e7-9ce0-4695-9611-adbe441ac432" containerName="route-controller-manager" containerID="cri-o://991e916c84398ffa84e223e63294b0ae59d231b2ed7f4999afe610124fffebaa" gracePeriod=30 Mar 14 05:31:06 crc kubenswrapper[4713]: I0314 05:31:06.057816 4713 generic.go:334] "Generic (PLEG): container finished" podID="53a50175-ee14-4d0b-9037-b1cc213e10e6" containerID="0a15700f6cdf7699b14af79364f463ad48c96018b542f02d0948059fba682390" exitCode=0 Mar 14 05:31:06 crc kubenswrapper[4713]: I0314 05:31:06.057939 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" event={"ID":"53a50175-ee14-4d0b-9037-b1cc213e10e6","Type":"ContainerDied","Data":"0a15700f6cdf7699b14af79364f463ad48c96018b542f02d0948059fba682390"} Mar 14 05:31:06 crc kubenswrapper[4713]: I0314 05:31:06.060908 4713 generic.go:334] "Generic (PLEG): container finished" 
podID="856889e7-9ce0-4695-9611-adbe441ac432" containerID="991e916c84398ffa84e223e63294b0ae59d231b2ed7f4999afe610124fffebaa" exitCode=0 Mar 14 05:31:06 crc kubenswrapper[4713]: I0314 05:31:06.060958 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" event={"ID":"856889e7-9ce0-4695-9611-adbe441ac432","Type":"ContainerDied","Data":"991e916c84398ffa84e223e63294b0ae59d231b2ed7f4999afe610124fffebaa"} Mar 14 05:31:07 crc kubenswrapper[4713]: I0314 05:31:07.899252 4713 ???:1] "http: TLS handshake error from 192.168.126.11:43838: no serving certificate available for the kubelet" Mar 14 05:31:08 crc kubenswrapper[4713]: I0314 05:31:08.988126 4713 patch_prober.go:28] interesting pod/route-controller-manager-5c5654d57f-wxmkv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 14 05:31:08 crc kubenswrapper[4713]: I0314 05:31:08.988195 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" podUID="856889e7-9ce0-4695-9611-adbe441ac432" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 14 05:31:09 crc kubenswrapper[4713]: I0314 05:31:09.878253 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" Mar 14 05:31:10 crc kubenswrapper[4713]: I0314 05:31:10.732404 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 14 05:31:10 crc kubenswrapper[4713]: I0314 05:31:10.732502 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:31:11 crc kubenswrapper[4713]: I0314 05:31:11.776996 4713 patch_prober.go:28] interesting pod/controller-manager-7f9cf7cff-2p6n4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Mar 14 05:31:11 crc kubenswrapper[4713]: I0314 05:31:11.777402 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" podUID="53a50175-ee14-4d0b-9037-b1cc213e10e6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.916384 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.954365 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-668556b7cc-nh6xj"] Mar 14 05:31:15 crc kubenswrapper[4713]: E0314 05:31:15.954770 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686d5c0e-b2c8-40a1-a6f3-bf913724d956" containerName="pruner" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.954794 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="686d5c0e-b2c8-40a1-a6f3-bf913724d956" containerName="pruner" Mar 14 05:31:15 crc kubenswrapper[4713]: E0314 05:31:15.954810 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a50175-ee14-4d0b-9037-b1cc213e10e6" containerName="controller-manager" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.954820 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a50175-ee14-4d0b-9037-b1cc213e10e6" containerName="controller-manager" Mar 14 05:31:15 crc kubenswrapper[4713]: E0314 05:31:15.954832 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa061f3-ae29-419e-8bd3-61f8dd7436a0" containerName="pruner" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.954841 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa061f3-ae29-419e-8bd3-61f8dd7436a0" containerName="pruner" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.954989 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa061f3-ae29-419e-8bd3-61f8dd7436a0" containerName="pruner" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.955010 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="686d5c0e-b2c8-40a1-a6f3-bf913724d956" containerName="pruner" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.955020 4713 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="53a50175-ee14-4d0b-9037-b1cc213e10e6" containerName="controller-manager" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.955669 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:15 crc kubenswrapper[4713]: I0314 05:31:15.957550 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-668556b7cc-nh6xj"] Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.020328 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-proxy-ca-bundles\") pod \"53a50175-ee14-4d0b-9037-b1cc213e10e6\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.020466 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-client-ca\") pod \"53a50175-ee14-4d0b-9037-b1cc213e10e6\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.020549 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89bqp\" (UniqueName: \"kubernetes.io/projected/53a50175-ee14-4d0b-9037-b1cc213e10e6-kube-api-access-89bqp\") pod \"53a50175-ee14-4d0b-9037-b1cc213e10e6\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.020583 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-config\") pod \"53a50175-ee14-4d0b-9037-b1cc213e10e6\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.020627 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a50175-ee14-4d0b-9037-b1cc213e10e6-serving-cert\") pod \"53a50175-ee14-4d0b-9037-b1cc213e10e6\" (UID: \"53a50175-ee14-4d0b-9037-b1cc213e10e6\") " Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.020893 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-config\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.020932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-proxy-ca-bundles\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.020973 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg6rh\" (UniqueName: \"kubernetes.io/projected/e95066d4-853e-4733-9144-38ecd2e8675f-kube-api-access-pg6rh\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.021025 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e95066d4-853e-4733-9144-38ecd2e8675f-serving-cert\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " 
pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.021309 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-client-ca\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.021544 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-client-ca" (OuterVolumeSpecName: "client-ca") pod "53a50175-ee14-4d0b-9037-b1cc213e10e6" (UID: "53a50175-ee14-4d0b-9037-b1cc213e10e6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.021576 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "53a50175-ee14-4d0b-9037-b1cc213e10e6" (UID: "53a50175-ee14-4d0b-9037-b1cc213e10e6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.021909 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-config" (OuterVolumeSpecName: "config") pod "53a50175-ee14-4d0b-9037-b1cc213e10e6" (UID: "53a50175-ee14-4d0b-9037-b1cc213e10e6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.028083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a50175-ee14-4d0b-9037-b1cc213e10e6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "53a50175-ee14-4d0b-9037-b1cc213e10e6" (UID: "53a50175-ee14-4d0b-9037-b1cc213e10e6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.029132 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a50175-ee14-4d0b-9037-b1cc213e10e6-kube-api-access-89bqp" (OuterVolumeSpecName: "kube-api-access-89bqp") pod "53a50175-ee14-4d0b-9037-b1cc213e10e6" (UID: "53a50175-ee14-4d0b-9037-b1cc213e10e6"). InnerVolumeSpecName "kube-api-access-89bqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.122660 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-proxy-ca-bundles\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.122713 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg6rh\" (UniqueName: \"kubernetes.io/projected/e95066d4-853e-4733-9144-38ecd2e8675f-kube-api-access-pg6rh\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.122831 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e95066d4-853e-4733-9144-38ecd2e8675f-serving-cert\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.122876 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-client-ca\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.122918 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-config\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.122970 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.122982 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a50175-ee14-4d0b-9037-b1cc213e10e6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.122993 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.123003 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/53a50175-ee14-4d0b-9037-b1cc213e10e6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.123013 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89bqp\" (UniqueName: \"kubernetes.io/projected/53a50175-ee14-4d0b-9037-b1cc213e10e6-kube-api-access-89bqp\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.124150 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-client-ca\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.124433 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-config\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.124818 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-proxy-ca-bundles\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.126726 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" event={"ID":"53a50175-ee14-4d0b-9037-b1cc213e10e6","Type":"ContainerDied","Data":"ce3a9ad0d060536840441cb2310b1f82f07feecb9319de941ec3a96af86a7b0d"} Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 
05:31:16.126783 4713 scope.go:117] "RemoveContainer" containerID="0a15700f6cdf7699b14af79364f463ad48c96018b542f02d0948059fba682390" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.126919 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.128277 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e95066d4-853e-4733-9144-38ecd2e8675f-serving-cert\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.139172 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg6rh\" (UniqueName: \"kubernetes.io/projected/e95066d4-853e-4733-9144-38ecd2e8675f-kube-api-access-pg6rh\") pod \"controller-manager-668556b7cc-nh6xj\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.156410 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4"] Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.159301 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f9cf7cff-2p6n4"] Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.301370 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:16 crc kubenswrapper[4713]: E0314 05:31:16.962044 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 14 05:31:16 crc kubenswrapper[4713]: E0314 05:31:16.962560 4713 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 05:31:16 crc kubenswrapper[4713]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 14 05:31:16 crc kubenswrapper[4713]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swwcn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29557770-dmq2m_openshift-infra(62a03b8e-3b89-41c5-9399-a6ae0d44a53c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 14 05:31:16 crc kubenswrapper[4713]: > logger="UnhandledError" Mar 14 05:31:16 crc kubenswrapper[4713]: E0314 05:31:16.964158 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" podUID="62a03b8e-3b89-41c5-9399-a6ae0d44a53c" Mar 14 05:31:16 crc kubenswrapper[4713]: I0314 05:31:16.976941 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.040514 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856889e7-9ce0-4695-9611-adbe441ac432-serving-cert\") pod \"856889e7-9ce0-4695-9611-adbe441ac432\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.040554 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-client-ca\") pod \"856889e7-9ce0-4695-9611-adbe441ac432\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.040582 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8dcr\" (UniqueName: \"kubernetes.io/projected/856889e7-9ce0-4695-9611-adbe441ac432-kube-api-access-v8dcr\") pod \"856889e7-9ce0-4695-9611-adbe441ac432\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.040619 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-config\") pod \"856889e7-9ce0-4695-9611-adbe441ac432\" (UID: \"856889e7-9ce0-4695-9611-adbe441ac432\") " Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.041712 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-config" (OuterVolumeSpecName: "config") pod "856889e7-9ce0-4695-9611-adbe441ac432" (UID: "856889e7-9ce0-4695-9611-adbe441ac432"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.042740 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-client-ca" (OuterVolumeSpecName: "client-ca") pod "856889e7-9ce0-4695-9611-adbe441ac432" (UID: "856889e7-9ce0-4695-9611-adbe441ac432"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.045069 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856889e7-9ce0-4695-9611-adbe441ac432-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "856889e7-9ce0-4695-9611-adbe441ac432" (UID: "856889e7-9ce0-4695-9611-adbe441ac432"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.045119 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856889e7-9ce0-4695-9611-adbe441ac432-kube-api-access-v8dcr" (OuterVolumeSpecName: "kube-api-access-v8dcr") pod "856889e7-9ce0-4695-9611-adbe441ac432" (UID: "856889e7-9ce0-4695-9611-adbe441ac432"). InnerVolumeSpecName "kube-api-access-v8dcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.133822 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" event={"ID":"856889e7-9ce0-4695-9611-adbe441ac432","Type":"ContainerDied","Data":"86929dde45026ba482dede597ceff0b7382fa059187f9181868ce27d4d1eeeaa"} Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.133852 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv" Mar 14 05:31:17 crc kubenswrapper[4713]: E0314 05:31:17.134942 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" podUID="62a03b8e-3b89-41c5-9399-a6ae0d44a53c" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.141574 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.141602 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856889e7-9ce0-4695-9611-adbe441ac432-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.141613 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856889e7-9ce0-4695-9611-adbe441ac432-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.141622 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8dcr\" (UniqueName: 
\"kubernetes.io/projected/856889e7-9ce0-4695-9611-adbe441ac432-kube-api-access-v8dcr\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.176915 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"] Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.180807 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5654d57f-wxmkv"] Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.268251 4713 scope.go:117] "RemoveContainer" containerID="991e916c84398ffa84e223e63294b0ae59d231b2ed7f4999afe610124fffebaa" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.571031 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a50175-ee14-4d0b-9037-b1cc213e10e6" path="/var/lib/kubelet/pods/53a50175-ee14-4d0b-9037-b1cc213e10e6/volumes" Mar 14 05:31:17 crc kubenswrapper[4713]: I0314 05:31:17.571851 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="856889e7-9ce0-4695-9611-adbe441ac432" path="/var/lib/kubelet/pods/856889e7-9ce0-4695-9611-adbe441ac432/volumes" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.388814 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz"] Mar 14 05:31:18 crc kubenswrapper[4713]: E0314 05:31:18.389736 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856889e7-9ce0-4695-9611-adbe441ac432" containerName="route-controller-manager" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.389762 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="856889e7-9ce0-4695-9611-adbe441ac432" containerName="route-controller-manager" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.389974 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="856889e7-9ce0-4695-9611-adbe441ac432" 
containerName="route-controller-manager" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.390709 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.394114 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.394141 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.395715 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.396133 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.396362 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.396738 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.399527 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz"] Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.460566 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm6tr\" (UniqueName: \"kubernetes.io/projected/ed214824-63d6-4290-9050-2c775b9c05bc-kube-api-access-rm6tr\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " 
pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.460655 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed214824-63d6-4290-9050-2c775b9c05bc-serving-cert\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.460693 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-config\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.460739 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-client-ca\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.563141 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-client-ca\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.563810 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rm6tr\" (UniqueName: \"kubernetes.io/projected/ed214824-63d6-4290-9050-2c775b9c05bc-kube-api-access-rm6tr\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.563873 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed214824-63d6-4290-9050-2c775b9c05bc-serving-cert\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.563905 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-config\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.565400 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-client-ca\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.565513 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-config\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " 
pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.573495 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed214824-63d6-4290-9050-2c775b9c05bc-serving-cert\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.582066 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm6tr\" (UniqueName: \"kubernetes.io/projected/ed214824-63d6-4290-9050-2c775b9c05bc-kube-api-access-rm6tr\") pod \"route-controller-manager-599c9f4b49-xm4pz\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:18 crc kubenswrapper[4713]: I0314 05:31:18.723847 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:20 crc kubenswrapper[4713]: I0314 05:31:20.229644 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2t6mv"] Mar 14 05:31:21 crc kubenswrapper[4713]: I0314 05:31:21.675618 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" Mar 14 05:31:24 crc kubenswrapper[4713]: I0314 05:31:24.707137 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-668556b7cc-nh6xj"] Mar 14 05:31:24 crc kubenswrapper[4713]: I0314 05:31:24.784373 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz"] Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.231470 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.237419 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.287612 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.290030 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.290126 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.367864 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af6402f-655e-4e97-898d-4457762aa640-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2af6402f-655e-4e97-898d-4457762aa640\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.367920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af6402f-655e-4e97-898d-4457762aa640-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2af6402f-655e-4e97-898d-4457762aa640\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.469395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af6402f-655e-4e97-898d-4457762aa640-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2af6402f-655e-4e97-898d-4457762aa640\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.469468 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af6402f-655e-4e97-898d-4457762aa640-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2af6402f-655e-4e97-898d-4457762aa640\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.469514 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af6402f-655e-4e97-898d-4457762aa640-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2af6402f-655e-4e97-898d-4457762aa640\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.487960 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af6402f-655e-4e97-898d-4457762aa640-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2af6402f-655e-4e97-898d-4457762aa640\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.603794 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:26 crc kubenswrapper[4713]: I0314 05:31:26.946072 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 05:31:27 crc kubenswrapper[4713]: E0314 05:31:27.481303 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 14 05:31:27 crc kubenswrapper[4713]: E0314 05:31:27.481499 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wz7r9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5nc5f_openshift-marketplace(c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 05:31:27 crc kubenswrapper[4713]: E0314 05:31:27.482711 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5nc5f" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" Mar 14 05:31:28 crc 
kubenswrapper[4713]: E0314 05:31:28.793973 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5nc5f" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" Mar 14 05:31:28 crc kubenswrapper[4713]: W0314 05:31:28.795954 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2a1689_2973_4684_88d0_4ac7edb9b1d3.slice/crio-9f1e03b3f15773b432a5b24b64634e73241b1c2ea7f55f54985cf538e6c49a37 WatchSource:0}: Error finding container 9f1e03b3f15773b432a5b24b64634e73241b1c2ea7f55f54985cf538e6c49a37: Status 404 returned error can't find the container with id 9f1e03b3f15773b432a5b24b64634e73241b1c2ea7f55f54985cf538e6c49a37 Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.897389 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.897570 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fpgn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jncw4_openshift-marketplace(c49b0182-cb22-4f55-b7a8-893646fa21fe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.901328 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jncw4" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" Mar 14 05:31:28 crc 
kubenswrapper[4713]: E0314 05:31:28.914351 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.914624 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wmxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-b9zt2_openshift-marketplace(2bdf5393-1e5e-4965-a24c-b45a22c6053e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.916908 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-b9zt2" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.944779 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.945058 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4cng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2fddx_openshift-marketplace(e5416d4f-43bb-4ca5-a433-1e408bc69d26): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.946297 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2fddx" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" Mar 14 05:31:28 crc 
kubenswrapper[4713]: E0314 05:31:28.994653 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.995140 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7p4j2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-zbxl4_openshift-marketplace(caec2fda-bb55-4f4f-a487-8deeb4bf5da4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 05:31:28 crc kubenswrapper[4713]: E0314 05:31:28.996461 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zbxl4" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" Mar 14 05:31:29 crc kubenswrapper[4713]: I0314 05:31:29.195120 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-668556b7cc-nh6xj"] Mar 14 05:31:29 crc kubenswrapper[4713]: I0314 05:31:29.207286 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" event={"ID":"8f2a1689-2973-4684-88d0-4ac7edb9b1d3","Type":"ContainerStarted","Data":"9f1e03b3f15773b432a5b24b64634e73241b1c2ea7f55f54985cf538e6c49a37"} Mar 14 05:31:29 crc kubenswrapper[4713]: E0314 05:31:29.216082 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zbxl4" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" Mar 14 05:31:29 crc kubenswrapper[4713]: E0314 05:31:29.216554 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2fddx" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" Mar 14 05:31:29 crc kubenswrapper[4713]: E0314 
05:31:29.216644 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jncw4" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" Mar 14 05:31:29 crc kubenswrapper[4713]: E0314 05:31:29.224081 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-b9zt2" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" Mar 14 05:31:29 crc kubenswrapper[4713]: I0314 05:31:29.490901 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 05:31:29 crc kubenswrapper[4713]: I0314 05:31:29.510125 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz"] Mar 14 05:31:29 crc kubenswrapper[4713]: W0314 05:31:29.525297 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded214824_63d6_4290_9050_2c775b9c05bc.slice/crio-57d0a472eeda6ae14bb887cc3d1718d44c4084f6d9843c43efda4db32b246aba WatchSource:0}: Error finding container 57d0a472eeda6ae14bb887cc3d1718d44c4084f6d9843c43efda4db32b246aba: Status 404 returned error can't find the container with id 57d0a472eeda6ae14bb887cc3d1718d44c4084f6d9843c43efda4db32b246aba Mar 14 05:31:30 crc kubenswrapper[4713]: I0314 05:31:30.213620 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvtpw" event={"ID":"99ba127f-5518-4d4e-9581-10970dcb998c","Type":"ContainerStarted","Data":"b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2"} Mar 14 05:31:30 crc 
kubenswrapper[4713]: I0314 05:31:30.215494 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" event={"ID":"e95066d4-853e-4733-9144-38ecd2e8675f","Type":"ContainerStarted","Data":"3581ae3141cad20b19ad9bcfcfbaf4188b2c221212613e8b610694cef5e751e5"} Mar 14 05:31:30 crc kubenswrapper[4713]: I0314 05:31:30.215520 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" event={"ID":"e95066d4-853e-4733-9144-38ecd2e8675f","Type":"ContainerStarted","Data":"a3621a880eb3425b472cf473a2ab21f27a9922e0295025d9d195c318a0080e05"} Mar 14 05:31:30 crc kubenswrapper[4713]: I0314 05:31:30.217660 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pwlb" event={"ID":"4cd7d2d9-c704-4019-9329-52c5fa68af0d","Type":"ContainerStarted","Data":"847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e"} Mar 14 05:31:30 crc kubenswrapper[4713]: I0314 05:31:30.218929 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" event={"ID":"ed214824-63d6-4290-9050-2c775b9c05bc","Type":"ContainerStarted","Data":"57d0a472eeda6ae14bb887cc3d1718d44c4084f6d9843c43efda4db32b246aba"} Mar 14 05:31:30 crc kubenswrapper[4713]: I0314 05:31:30.219974 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2af6402f-655e-4e97-898d-4457762aa640","Type":"ContainerStarted","Data":"58e4226ecaa5842a08ee296e8f2b942415b986445d566f43d8ebdd934fcac952"} Mar 14 05:31:30 crc kubenswrapper[4713]: I0314 05:31:30.222133 4713 generic.go:334] "Generic (PLEG): container finished" podID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerID="b1f5cfb739605bde2a0d7c1233a807d9711c2dabc24470031bdb2b2e85a38878" exitCode=0 Mar 14 05:31:30 crc kubenswrapper[4713]: I0314 05:31:30.222246 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk5s" event={"ID":"5979ce26-1ed6-49e0-ac41-24e70593ab24","Type":"ContainerDied","Data":"b1f5cfb739605bde2a0d7c1233a807d9711c2dabc24470031bdb2b2e85a38878"} Mar 14 05:31:30 crc kubenswrapper[4713]: I0314 05:31:30.223741 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" event={"ID":"8f2a1689-2973-4684-88d0-4ac7edb9b1d3","Type":"ContainerStarted","Data":"e62219eaac038da81b7420439df4d961de36e50811ce98e7989245b6caa2ae37"} Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.230556 4713 generic.go:334] "Generic (PLEG): container finished" podID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerID="847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e" exitCode=0 Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.230661 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pwlb" event={"ID":"4cd7d2d9-c704-4019-9329-52c5fa68af0d","Type":"ContainerDied","Data":"847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e"} Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.233566 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" event={"ID":"ed214824-63d6-4290-9050-2c775b9c05bc","Type":"ContainerStarted","Data":"6abfb185a2a4236ca4e6570c69e2f4453762139f7ca97e670df729c511ec4bdd"} Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.233708 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" podUID="ed214824-63d6-4290-9050-2c775b9c05bc" containerName="route-controller-manager" containerID="cri-o://6abfb185a2a4236ca4e6570c69e2f4453762139f7ca97e670df729c511ec4bdd" gracePeriod=30 Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.233812 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.235121 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2af6402f-655e-4e97-898d-4457762aa640","Type":"ContainerStarted","Data":"44d266cc382d63227423b3005c2f728fafc0e6c0c7faa9bc42ab66889227999e"} Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.238132 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2t6mv" event={"ID":"8f2a1689-2973-4684-88d0-4ac7edb9b1d3","Type":"ContainerStarted","Data":"9b7de81bc705d593940ccc4e32a82b31d03a494fb891ea7ec720cf413a3020de"} Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.242521 4713 generic.go:334] "Generic (PLEG): container finished" podID="99ba127f-5518-4d4e-9581-10970dcb998c" containerID="b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2" exitCode=0 Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.242582 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvtpw" event={"ID":"99ba127f-5518-4d4e-9581-10970dcb998c","Type":"ContainerDied","Data":"b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2"} Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.242746 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" podUID="e95066d4-853e-4733-9144-38ecd2e8675f" containerName="controller-manager" containerID="cri-o://3581ae3141cad20b19ad9bcfcfbaf4188b2c221212613e8b610694cef5e751e5" gracePeriod=30 Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.244433 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 
05:31:31.281273 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" podStartSLOduration=27.281253629 podStartE2EDuration="27.281253629s" podCreationTimestamp="2026-03-14 05:31:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:31:31.278685587 +0000 UTC m=+274.366594887" watchObservedRunningTime="2026-03-14 05:31:31.281253629 +0000 UTC m=+274.369162929" Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.304744 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" podStartSLOduration=27.304718315 podStartE2EDuration="27.304718315s" podCreationTimestamp="2026-03-14 05:31:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:31:31.302083972 +0000 UTC m=+274.389993272" watchObservedRunningTime="2026-03-14 05:31:31.304718315 +0000 UTC m=+274.392627615" Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.339541 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=5.339519848 podStartE2EDuration="5.339519848s" podCreationTimestamp="2026-03-14 05:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:31:31.332133655 +0000 UTC m=+274.420042955" watchObservedRunningTime="2026-03-14 05:31:31.339519848 +0000 UTC m=+274.427429148" Mar 14 05:31:31 crc kubenswrapper[4713]: I0314 05:31:31.362903 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2t6mv" podStartSLOduration=205.36287724 podStartE2EDuration="3m25.36287724s" 
podCreationTimestamp="2026-03-14 05:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:31:31.358241955 +0000 UTC m=+274.446151255" watchObservedRunningTime="2026-03-14 05:31:31.36287724 +0000 UTC m=+274.450786540" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.248799 4713 generic.go:334] "Generic (PLEG): container finished" podID="e95066d4-853e-4733-9144-38ecd2e8675f" containerID="3581ae3141cad20b19ad9bcfcfbaf4188b2c221212613e8b610694cef5e751e5" exitCode=0 Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.248886 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" event={"ID":"e95066d4-853e-4733-9144-38ecd2e8675f","Type":"ContainerDied","Data":"3581ae3141cad20b19ad9bcfcfbaf4188b2c221212613e8b610694cef5e751e5"} Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.250989 4713 generic.go:334] "Generic (PLEG): container finished" podID="ed214824-63d6-4290-9050-2c775b9c05bc" containerID="6abfb185a2a4236ca4e6570c69e2f4453762139f7ca97e670df729c511ec4bdd" exitCode=0 Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.251058 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" event={"ID":"ed214824-63d6-4290-9050-2c775b9c05bc","Type":"ContainerDied","Data":"6abfb185a2a4236ca4e6570c69e2f4453762139f7ca97e670df729c511ec4bdd"} Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.252462 4713 generic.go:334] "Generic (PLEG): container finished" podID="2af6402f-655e-4e97-898d-4457762aa640" containerID="44d266cc382d63227423b3005c2f728fafc0e6c0c7faa9bc42ab66889227999e" exitCode=0 Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.252569 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"2af6402f-655e-4e97-898d-4457762aa640","Type":"ContainerDied","Data":"44d266cc382d63227423b3005c2f728fafc0e6c0c7faa9bc42ab66889227999e"} Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.568002 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.572813 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.603371 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk"] Mar 14 05:31:32 crc kubenswrapper[4713]: E0314 05:31:32.603696 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95066d4-853e-4733-9144-38ecd2e8675f" containerName="controller-manager" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.603713 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95066d4-853e-4733-9144-38ecd2e8675f" containerName="controller-manager" Mar 14 05:31:32 crc kubenswrapper[4713]: E0314 05:31:32.603732 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed214824-63d6-4290-9050-2c775b9c05bc" containerName="route-controller-manager" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.603739 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed214824-63d6-4290-9050-2c775b9c05bc" containerName="route-controller-manager" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.603824 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95066d4-853e-4733-9144-38ecd2e8675f" containerName="controller-manager" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.603839 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed214824-63d6-4290-9050-2c775b9c05bc" 
containerName="route-controller-manager" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.604306 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.609806 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk"] Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.668586 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-config\") pod \"ed214824-63d6-4290-9050-2c775b9c05bc\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.668955 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg6rh\" (UniqueName: \"kubernetes.io/projected/e95066d4-853e-4733-9144-38ecd2e8675f-kube-api-access-pg6rh\") pod \"e95066d4-853e-4733-9144-38ecd2e8675f\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669178 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm6tr\" (UniqueName: \"kubernetes.io/projected/ed214824-63d6-4290-9050-2c775b9c05bc-kube-api-access-rm6tr\") pod \"ed214824-63d6-4290-9050-2c775b9c05bc\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669263 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-client-ca\") pod \"e95066d4-853e-4733-9144-38ecd2e8675f\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669317 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-config\") pod \"e95066d4-853e-4733-9144-38ecd2e8675f\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669346 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed214824-63d6-4290-9050-2c775b9c05bc-serving-cert\") pod \"ed214824-63d6-4290-9050-2c775b9c05bc\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669386 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-proxy-ca-bundles\") pod \"e95066d4-853e-4733-9144-38ecd2e8675f\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669404 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-client-ca\") pod \"ed214824-63d6-4290-9050-2c775b9c05bc\" (UID: \"ed214824-63d6-4290-9050-2c775b9c05bc\") " Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669425 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e95066d4-853e-4733-9144-38ecd2e8675f-serving-cert\") pod \"e95066d4-853e-4733-9144-38ecd2e8675f\" (UID: \"e95066d4-853e-4733-9144-38ecd2e8675f\") " Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669624 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-client-ca\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " 
pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669698 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-proxy-ca-bundles\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669741 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt84c\" (UniqueName: \"kubernetes.io/projected/2de25776-134d-40b9-bf9f-365ed268f707-kube-api-access-kt84c\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669804 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-config\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669833 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2de25776-134d-40b9-bf9f-365ed268f707-serving-cert\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.669946 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-config" (OuterVolumeSpecName: "config") pod "ed214824-63d6-4290-9050-2c775b9c05bc" (UID: "ed214824-63d6-4290-9050-2c775b9c05bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.670049 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-client-ca" (OuterVolumeSpecName: "client-ca") pod "e95066d4-853e-4733-9144-38ecd2e8675f" (UID: "e95066d4-853e-4733-9144-38ecd2e8675f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.670374 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e95066d4-853e-4733-9144-38ecd2e8675f" (UID: "e95066d4-853e-4733-9144-38ecd2e8675f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.670485 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-client-ca" (OuterVolumeSpecName: "client-ca") pod "ed214824-63d6-4290-9050-2c775b9c05bc" (UID: "ed214824-63d6-4290-9050-2c775b9c05bc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.670527 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-config" (OuterVolumeSpecName: "config") pod "e95066d4-853e-4733-9144-38ecd2e8675f" (UID: "e95066d4-853e-4733-9144-38ecd2e8675f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.675096 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed214824-63d6-4290-9050-2c775b9c05bc-kube-api-access-rm6tr" (OuterVolumeSpecName: "kube-api-access-rm6tr") pod "ed214824-63d6-4290-9050-2c775b9c05bc" (UID: "ed214824-63d6-4290-9050-2c775b9c05bc"). InnerVolumeSpecName "kube-api-access-rm6tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.687820 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95066d4-853e-4733-9144-38ecd2e8675f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e95066d4-853e-4733-9144-38ecd2e8675f" (UID: "e95066d4-853e-4733-9144-38ecd2e8675f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.687948 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95066d4-853e-4733-9144-38ecd2e8675f-kube-api-access-pg6rh" (OuterVolumeSpecName: "kube-api-access-pg6rh") pod "e95066d4-853e-4733-9144-38ecd2e8675f" (UID: "e95066d4-853e-4733-9144-38ecd2e8675f"). InnerVolumeSpecName "kube-api-access-pg6rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.688001 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed214824-63d6-4290-9050-2c775b9c05bc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ed214824-63d6-4290-9050-2c775b9c05bc" (UID: "ed214824-63d6-4290-9050-2c775b9c05bc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.771049 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-client-ca\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.771128 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-proxy-ca-bundles\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.771156 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt84c\" (UniqueName: \"kubernetes.io/projected/2de25776-134d-40b9-bf9f-365ed268f707-kube-api-access-kt84c\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.771232 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-config\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.771257 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2de25776-134d-40b9-bf9f-365ed268f707-serving-cert\") pod 
\"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.771735 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.771980 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-client-ca\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.772405 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg6rh\" (UniqueName: \"kubernetes.io/projected/e95066d4-853e-4733-9144-38ecd2e8675f-kube-api-access-pg6rh\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.772438 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm6tr\" (UniqueName: \"kubernetes.io/projected/ed214824-63d6-4290-9050-2c775b9c05bc-kube-api-access-rm6tr\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.772504 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.772516 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.772526 4713 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed214824-63d6-4290-9050-2c775b9c05bc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.772535 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e95066d4-853e-4733-9144-38ecd2e8675f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.772565 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed214824-63d6-4290-9050-2c775b9c05bc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.772573 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e95066d4-853e-4733-9144-38ecd2e8675f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.773147 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-config\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.774103 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-proxy-ca-bundles\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.775331 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2de25776-134d-40b9-bf9f-365ed268f707-serving-cert\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.788121 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt84c\" (UniqueName: \"kubernetes.io/projected/2de25776-134d-40b9-bf9f-365ed268f707-kube-api-access-kt84c\") pod \"controller-manager-5b6d5b4c97-v8rrk\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:32 crc kubenswrapper[4713]: I0314 05:31:32.940278 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.108251 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk"] Mar 14 05:31:33 crc kubenswrapper[4713]: W0314 05:31:33.115760 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de25776_134d_40b9_bf9f_365ed268f707.slice/crio-177a8aa1cbb57a8c9a47892fabfe34aa7034ffe790d197cc35337e219e922a2d WatchSource:0}: Error finding container 177a8aa1cbb57a8c9a47892fabfe34aa7034ffe790d197cc35337e219e922a2d: Status 404 returned error can't find the container with id 177a8aa1cbb57a8c9a47892fabfe34aa7034ffe790d197cc35337e219e922a2d Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.218200 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.218914 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.233810 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.278619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.278704 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-var-lock\") pod \"installer-9-crc\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.278755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kube-api-access\") pod \"installer-9-crc\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.286771 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" event={"ID":"e95066d4-853e-4733-9144-38ecd2e8675f","Type":"ContainerDied","Data":"a3621a880eb3425b472cf473a2ab21f27a9922e0295025d9d195c318a0080e05"} Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.286836 4713 scope.go:117] "RemoveContainer" containerID="3581ae3141cad20b19ad9bcfcfbaf4188b2c221212613e8b610694cef5e751e5" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.286988 4713 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-668556b7cc-nh6xj" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.295110 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" event={"ID":"2de25776-134d-40b9-bf9f-365ed268f707","Type":"ContainerStarted","Data":"177a8aa1cbb57a8c9a47892fabfe34aa7034ffe790d197cc35337e219e922a2d"} Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.298185 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.298175 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz" event={"ID":"ed214824-63d6-4290-9050-2c775b9c05bc","Type":"ContainerDied","Data":"57d0a472eeda6ae14bb887cc3d1718d44c4084f6d9843c43efda4db32b246aba"} Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.305640 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk5s" event={"ID":"5979ce26-1ed6-49e0-ac41-24e70593ab24","Type":"ContainerStarted","Data":"9f3155d3d17a8adbd0ffbecb325066464d7d69a0678f12fe26f8572c7496eff1"} Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.321627 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-668556b7cc-nh6xj"] Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.333572 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-668556b7cc-nh6xj"] Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.333632 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz"] Mar 14 05:31:33 crc 
kubenswrapper[4713]: I0314 05:31:33.333643 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9f4b49-xm4pz"] Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.380240 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.380306 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-var-lock\") pod \"installer-9-crc\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.380337 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kube-api-access\") pod \"installer-9-crc\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.380737 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.380772 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-var-lock\") pod \"installer-9-crc\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.407329 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kube-api-access\") pod \"installer-9-crc\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.571677 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e95066d4-853e-4733-9144-38ecd2e8675f" path="/var/lib/kubelet/pods/e95066d4-853e-4733-9144-38ecd2e8675f/volumes" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.572568 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed214824-63d6-4290-9050-2c775b9c05bc" path="/var/lib/kubelet/pods/ed214824-63d6-4290-9050-2c775b9c05bc/volumes" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.600199 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.843535 4713 scope.go:117] "RemoveContainer" containerID="6abfb185a2a4236ca4e6570c69e2f4453762139f7ca97e670df729c511ec4bdd" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.882886 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.988327 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af6402f-655e-4e97-898d-4457762aa640-kubelet-dir\") pod \"2af6402f-655e-4e97-898d-4457762aa640\" (UID: \"2af6402f-655e-4e97-898d-4457762aa640\") " Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.988456 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af6402f-655e-4e97-898d-4457762aa640-kube-api-access\") pod \"2af6402f-655e-4e97-898d-4457762aa640\" (UID: \"2af6402f-655e-4e97-898d-4457762aa640\") " Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.988475 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2af6402f-655e-4e97-898d-4457762aa640-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2af6402f-655e-4e97-898d-4457762aa640" (UID: "2af6402f-655e-4e97-898d-4457762aa640"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.988739 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af6402f-655e-4e97-898d-4457762aa640-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:33 crc kubenswrapper[4713]: I0314 05:31:33.992060 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af6402f-655e-4e97-898d-4457762aa640-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2af6402f-655e-4e97-898d-4457762aa640" (UID: "2af6402f-655e-4e97-898d-4457762aa640"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:31:34 crc kubenswrapper[4713]: I0314 05:31:34.090413 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af6402f-655e-4e97-898d-4457762aa640-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:34 crc kubenswrapper[4713]: I0314 05:31:34.319311 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2af6402f-655e-4e97-898d-4457762aa640","Type":"ContainerDied","Data":"58e4226ecaa5842a08ee296e8f2b942415b986445d566f43d8ebdd934fcac952"} Mar 14 05:31:34 crc kubenswrapper[4713]: I0314 05:31:34.319356 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 05:31:34 crc kubenswrapper[4713]: I0314 05:31:34.319379 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e4226ecaa5842a08ee296e8f2b942415b986445d566f43d8ebdd934fcac952" Mar 14 05:31:34 crc kubenswrapper[4713]: I0314 05:31:34.343144 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rrk5s" podStartSLOduration=4.14703588 podStartE2EDuration="45.343120543s" podCreationTimestamp="2026-03-14 05:30:49 +0000 UTC" firstStartedPulling="2026-03-14 05:30:51.523597319 +0000 UTC m=+234.611506619" lastFinishedPulling="2026-03-14 05:31:32.719681982 +0000 UTC m=+275.807591282" observedRunningTime="2026-03-14 05:31:34.338445477 +0000 UTC m=+277.426354777" watchObservedRunningTime="2026-03-14 05:31:34.343120543 +0000 UTC m=+277.431029833" Mar 14 05:31:34 crc kubenswrapper[4713]: I0314 05:31:34.801964 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.328449 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" event={"ID":"2de25776-134d-40b9-bf9f-365ed268f707","Type":"ContainerStarted","Data":"7261bfd35dcc35a378e94f78033b7777fa78602b3bdd913631561d729d39a856"} Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.329068 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.333100 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18","Type":"ContainerStarted","Data":"305f74f56f3e1bb03186454ecde1b36cf00445122e3feec7edc823e450f2b4d4"} Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.333171 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18","Type":"ContainerStarted","Data":"004d5dac22f6cde49d1bd11b3fd542a5672bc57473178531f191cd06f7d041d7"} Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.338460 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvtpw" event={"ID":"99ba127f-5518-4d4e-9581-10970dcb998c","Type":"ContainerStarted","Data":"73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427"} Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.339706 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.358158 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" podStartSLOduration=11.358131244 podStartE2EDuration="11.358131244s" podCreationTimestamp="2026-03-14 05:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:31:35.349919327 +0000 UTC m=+278.437828627" watchObservedRunningTime="2026-03-14 05:31:35.358131244 +0000 UTC m=+278.446040544" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.400291 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5"] Mar 14 05:31:35 crc kubenswrapper[4713]: E0314 05:31:35.400588 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af6402f-655e-4e97-898d-4457762aa640" containerName="pruner" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.400606 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af6402f-655e-4e97-898d-4457762aa640" containerName="pruner" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.400720 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af6402f-655e-4e97-898d-4457762aa640" containerName="pruner" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.401245 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.406103 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.407811 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.407905 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.409757 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.409816 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.414850 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.447352 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5"] Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.496512 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dvtpw" podStartSLOduration=4.535470934 podStartE2EDuration="45.496495608s" podCreationTimestamp="2026-03-14 05:30:50 +0000 UTC" firstStartedPulling="2026-03-14 05:30:53.634472161 +0000 UTC m=+236.722381461" lastFinishedPulling="2026-03-14 05:31:34.595496835 +0000 UTC m=+277.683406135" observedRunningTime="2026-03-14 05:31:35.45386225 +0000 UTC m=+278.541771550" 
watchObservedRunningTime="2026-03-14 05:31:35.496495608 +0000 UTC m=+278.584404908" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.509870 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz2zm\" (UniqueName: \"kubernetes.io/projected/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-kube-api-access-cz2zm\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.509924 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-config\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.509958 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-client-ca\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.510146 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-serving-cert\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.611306 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-serving-cert\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.611391 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2zm\" (UniqueName: \"kubernetes.io/projected/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-kube-api-access-cz2zm\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.611414 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-config\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.611441 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-client-ca\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.612382 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-client-ca\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " 
pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.613156 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-config\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.620522 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-serving-cert\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.632056 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz2zm\" (UniqueName: \"kubernetes.io/projected/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-kube-api-access-cz2zm\") pod \"route-controller-manager-558786cf58-pqpf5\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:35 crc kubenswrapper[4713]: I0314 05:31:35.745780 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:36 crc kubenswrapper[4713]: I0314 05:31:36.345631 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pwlb" event={"ID":"4cd7d2d9-c704-4019-9329-52c5fa68af0d","Type":"ContainerStarted","Data":"81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a"} Mar 14 05:31:36 crc kubenswrapper[4713]: I0314 05:31:36.398236 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2pwlb" podStartSLOduration=3.639189 podStartE2EDuration="45.398192523s" podCreationTimestamp="2026-03-14 05:30:51 +0000 UTC" firstStartedPulling="2026-03-14 05:30:53.718446517 +0000 UTC m=+236.806355817" lastFinishedPulling="2026-03-14 05:31:35.47745004 +0000 UTC m=+278.565359340" observedRunningTime="2026-03-14 05:31:36.397275714 +0000 UTC m=+279.485185014" watchObservedRunningTime="2026-03-14 05:31:36.398192523 +0000 UTC m=+279.486101833" Mar 14 05:31:36 crc kubenswrapper[4713]: I0314 05:31:36.398892 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.398884765 podStartE2EDuration="3.398884765s" podCreationTimestamp="2026-03-14 05:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:31:36.371430283 +0000 UTC m=+279.459339583" watchObservedRunningTime="2026-03-14 05:31:36.398884765 +0000 UTC m=+279.486794065" Mar 14 05:31:37 crc kubenswrapper[4713]: I0314 05:31:37.902903 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5"] Mar 14 05:31:38 crc kubenswrapper[4713]: I0314 05:31:38.394712 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" 
event={"ID":"62a03b8e-3b89-41c5-9399-a6ae0d44a53c","Type":"ContainerStarted","Data":"1432c561d4c9474ed2598f4d46956f32e1aa324b223448f9db9515c236a70a3d"} Mar 14 05:31:38 crc kubenswrapper[4713]: I0314 05:31:38.396488 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" event={"ID":"5fad1b56-3545-46d2-b33f-1e2887cb6e6a","Type":"ContainerStarted","Data":"f03fc4aa01255854faca8fd734cbdf76f01b594e47525187cc617a9d0955b559"} Mar 14 05:31:38 crc kubenswrapper[4713]: I0314 05:31:38.396631 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:38 crc kubenswrapper[4713]: I0314 05:31:38.396733 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" event={"ID":"5fad1b56-3545-46d2-b33f-1e2887cb6e6a","Type":"ContainerStarted","Data":"4a56f72af8553329fb8a2a1dc6c4297413a5e34b16c37633f9080799d87e6cc0"} Mar 14 05:31:38 crc kubenswrapper[4713]: I0314 05:31:38.412366 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" podStartSLOduration=43.055787119 podStartE2EDuration="1m38.412336278s" podCreationTimestamp="2026-03-14 05:30:00 +0000 UTC" firstStartedPulling="2026-03-14 05:30:42.284598441 +0000 UTC m=+225.372507741" lastFinishedPulling="2026-03-14 05:31:37.6411476 +0000 UTC m=+280.729056900" observedRunningTime="2026-03-14 05:31:38.408061114 +0000 UTC m=+281.495970414" watchObservedRunningTime="2026-03-14 05:31:38.412336278 +0000 UTC m=+281.500245578" Mar 14 05:31:38 crc kubenswrapper[4713]: I0314 05:31:38.436932 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" podStartSLOduration=14.43690986 podStartE2EDuration="14.43690986s" 
podCreationTimestamp="2026-03-14 05:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:31:38.433307337 +0000 UTC m=+281.521216657" watchObservedRunningTime="2026-03-14 05:31:38.43690986 +0000 UTC m=+281.524819170" Mar 14 05:31:38 crc kubenswrapper[4713]: I0314 05:31:38.439958 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:38 crc kubenswrapper[4713]: I0314 05:31:38.619617 4713 csr.go:261] certificate signing request csr-kwccm is approved, waiting to be issued Mar 14 05:31:38 crc kubenswrapper[4713]: I0314 05:31:38.626995 4713 csr.go:257] certificate signing request csr-kwccm is issued Mar 14 05:31:39 crc kubenswrapper[4713]: I0314 05:31:39.404432 4713 generic.go:334] "Generic (PLEG): container finished" podID="62a03b8e-3b89-41c5-9399-a6ae0d44a53c" containerID="1432c561d4c9474ed2598f4d46956f32e1aa324b223448f9db9515c236a70a3d" exitCode=0 Mar 14 05:31:39 crc kubenswrapper[4713]: I0314 05:31:39.404503 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" event={"ID":"62a03b8e-3b89-41c5-9399-a6ae0d44a53c","Type":"ContainerDied","Data":"1432c561d4c9474ed2598f4d46956f32e1aa324b223448f9db9515c236a70a3d"} Mar 14 05:31:39 crc kubenswrapper[4713]: I0314 05:31:39.628971 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 04:38:44.776839141 +0000 UTC Mar 14 05:31:39 crc kubenswrapper[4713]: I0314 05:31:39.629342 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6095h7m5.14750352s for next certificate rotation Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.043336 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.044452 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.172512 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.453239 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.525383 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk5s"] Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.630645 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-27 03:41:38.920482668 +0000 UTC Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.630686 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6190h9m58.289799755s for next certificate rotation Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.729800 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.731516 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.731569 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.731638 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.732398 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.732471 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38" gracePeriod=600 Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.899524 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-swwcn\" (UniqueName: \"kubernetes.io/projected/62a03b8e-3b89-41c5-9399-a6ae0d44a53c-kube-api-access-swwcn\") pod \"62a03b8e-3b89-41c5-9399-a6ae0d44a53c\" (UID: \"62a03b8e-3b89-41c5-9399-a6ae0d44a53c\") " Mar 14 05:31:40 crc kubenswrapper[4713]: I0314 05:31:40.906784 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a03b8e-3b89-41c5-9399-a6ae0d44a53c-kube-api-access-swwcn" (OuterVolumeSpecName: "kube-api-access-swwcn") pod "62a03b8e-3b89-41c5-9399-a6ae0d44a53c" (UID: "62a03b8e-3b89-41c5-9399-a6ae0d44a53c"). InnerVolumeSpecName "kube-api-access-swwcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.001910 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swwcn\" (UniqueName: \"kubernetes.io/projected/62a03b8e-3b89-41c5-9399-a6ae0d44a53c-kube-api-access-swwcn\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.114162 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.114235 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.424461 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" event={"ID":"62a03b8e-3b89-41c5-9399-a6ae0d44a53c","Type":"ContainerDied","Data":"6b8eac694448033f5042d86460f3120dafd3508761f8a39ebb9f19ca932e5e66"} Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.424813 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8eac694448033f5042d86460f3120dafd3508761f8a39ebb9f19ca932e5e66" Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.424481 4713 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557770-dmq2m" Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.426733 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.427152 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.427368 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38" exitCode=0 Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.427714 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38"} Mar 14 05:31:41 crc kubenswrapper[4713]: I0314 05:31:41.427793 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"3178c6299ee5084d508d01472b2baefd0a7f8c581742b5a075487a50da502998"} Mar 14 05:31:42 crc kubenswrapper[4713]: I0314 05:31:42.154749 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dvtpw" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" containerName="registry-server" probeResult="failure" output=< Mar 14 05:31:42 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:31:42 crc kubenswrapper[4713]: > Mar 14 05:31:42 crc kubenswrapper[4713]: I0314 05:31:42.432997 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rrk5s" 
podUID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerName="registry-server" containerID="cri-o://9f3155d3d17a8adbd0ffbecb325066464d7d69a0678f12fe26f8572c7496eff1" gracePeriod=2 Mar 14 05:31:42 crc kubenswrapper[4713]: I0314 05:31:42.474746 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2pwlb" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerName="registry-server" probeResult="failure" output=< Mar 14 05:31:42 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:31:42 crc kubenswrapper[4713]: > Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.443282 4713 generic.go:334] "Generic (PLEG): container finished" podID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerID="9f3155d3d17a8adbd0ffbecb325066464d7d69a0678f12fe26f8572c7496eff1" exitCode=0 Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.443346 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk5s" event={"ID":"5979ce26-1ed6-49e0-ac41-24e70593ab24","Type":"ContainerDied","Data":"9f3155d3d17a8adbd0ffbecb325066464d7d69a0678f12fe26f8572c7496eff1"} Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.569816 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.637928 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-889kj\" (UniqueName: \"kubernetes.io/projected/5979ce26-1ed6-49e0-ac41-24e70593ab24-kube-api-access-889kj\") pod \"5979ce26-1ed6-49e0-ac41-24e70593ab24\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.638037 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-catalog-content\") pod \"5979ce26-1ed6-49e0-ac41-24e70593ab24\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.638128 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-utilities\") pod \"5979ce26-1ed6-49e0-ac41-24e70593ab24\" (UID: \"5979ce26-1ed6-49e0-ac41-24e70593ab24\") " Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.639087 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-utilities" (OuterVolumeSpecName: "utilities") pod "5979ce26-1ed6-49e0-ac41-24e70593ab24" (UID: "5979ce26-1ed6-49e0-ac41-24e70593ab24"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.640645 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.649682 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5979ce26-1ed6-49e0-ac41-24e70593ab24-kube-api-access-889kj" (OuterVolumeSpecName: "kube-api-access-889kj") pod "5979ce26-1ed6-49e0-ac41-24e70593ab24" (UID: "5979ce26-1ed6-49e0-ac41-24e70593ab24"). InnerVolumeSpecName "kube-api-access-889kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.684918 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5979ce26-1ed6-49e0-ac41-24e70593ab24" (UID: "5979ce26-1ed6-49e0-ac41-24e70593ab24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.741841 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-889kj\" (UniqueName: \"kubernetes.io/projected/5979ce26-1ed6-49e0-ac41-24e70593ab24-kube-api-access-889kj\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:43 crc kubenswrapper[4713]: I0314 05:31:43.741913 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5979ce26-1ed6-49e0-ac41-24e70593ab24-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.451766 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerID="adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a" exitCode=0 Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.451826 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nc5f" event={"ID":"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f","Type":"ContainerDied","Data":"adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a"} Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.456893 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk5s" event={"ID":"5979ce26-1ed6-49e0-ac41-24e70593ab24","Type":"ContainerDied","Data":"67eb6b22f881254a65791cdbe130e5f03e991cc431a433dbac3f63c70041279f"} Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.457116 4713 scope.go:117] "RemoveContainer" containerID="9f3155d3d17a8adbd0ffbecb325066464d7d69a0678f12fe26f8572c7496eff1" Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.457200 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrk5s" Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.471971 4713 scope.go:117] "RemoveContainer" containerID="b1f5cfb739605bde2a0d7c1233a807d9711c2dabc24470031bdb2b2e85a38878" Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.499639 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk5s"] Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.499826 4713 scope.go:117] "RemoveContainer" containerID="483e297d99cb08fdf08d230d119cfea9eefcd44168a425e6bdfe5bcd534af96c" Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.504584 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk5s"] Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.708554 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk"] Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.708847 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" podUID="2de25776-134d-40b9-bf9f-365ed268f707" containerName="controller-manager" containerID="cri-o://7261bfd35dcc35a378e94f78033b7777fa78602b3bdd913631561d729d39a856" gracePeriod=30 Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.730343 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5"] Mar 14 05:31:44 crc kubenswrapper[4713]: I0314 05:31:44.730632 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" podUID="5fad1b56-3545-46d2-b33f-1e2887cb6e6a" containerName="route-controller-manager" containerID="cri-o://f03fc4aa01255854faca8fd734cbdf76f01b594e47525187cc617a9d0955b559" gracePeriod=30 Mar 14 05:31:45 crc 
kubenswrapper[4713]: I0314 05:31:45.478901 4713 generic.go:334] "Generic (PLEG): container finished" podID="5fad1b56-3545-46d2-b33f-1e2887cb6e6a" containerID="f03fc4aa01255854faca8fd734cbdf76f01b594e47525187cc617a9d0955b559" exitCode=0 Mar 14 05:31:45 crc kubenswrapper[4713]: I0314 05:31:45.479293 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" event={"ID":"5fad1b56-3545-46d2-b33f-1e2887cb6e6a","Type":"ContainerDied","Data":"f03fc4aa01255854faca8fd734cbdf76f01b594e47525187cc617a9d0955b559"} Mar 14 05:31:45 crc kubenswrapper[4713]: I0314 05:31:45.480991 4713 generic.go:334] "Generic (PLEG): container finished" podID="2de25776-134d-40b9-bf9f-365ed268f707" containerID="7261bfd35dcc35a378e94f78033b7777fa78602b3bdd913631561d729d39a856" exitCode=0 Mar 14 05:31:45 crc kubenswrapper[4713]: I0314 05:31:45.481115 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" event={"ID":"2de25776-134d-40b9-bf9f-365ed268f707","Type":"ContainerDied","Data":"7261bfd35dcc35a378e94f78033b7777fa78602b3bdd913631561d729d39a856"} Mar 14 05:31:45 crc kubenswrapper[4713]: I0314 05:31:45.570632 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5979ce26-1ed6-49e0-ac41-24e70593ab24" path="/var/lib/kubelet/pods/5979ce26-1ed6-49e0-ac41-24e70593ab24/volumes" Mar 14 05:31:45 crc kubenswrapper[4713]: I0314 05:31:45.747329 4713 patch_prober.go:28] interesting pod/route-controller-manager-558786cf58-pqpf5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Mar 14 05:31:45 crc kubenswrapper[4713]: I0314 05:31:45.747410 4713 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" podUID="5fad1b56-3545-46d2-b33f-1e2887cb6e6a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.126339 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.137485 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.150804 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k"] Mar 14 05:31:46 crc kubenswrapper[4713]: E0314 05:31:46.151074 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerName="registry-server" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151091 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerName="registry-server" Mar 14 05:31:46 crc kubenswrapper[4713]: E0314 05:31:46.151108 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fad1b56-3545-46d2-b33f-1e2887cb6e6a" containerName="route-controller-manager" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151115 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fad1b56-3545-46d2-b33f-1e2887cb6e6a" containerName="route-controller-manager" Mar 14 05:31:46 crc kubenswrapper[4713]: E0314 05:31:46.151123 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a03b8e-3b89-41c5-9399-a6ae0d44a53c" containerName="oc" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151129 
4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a03b8e-3b89-41c5-9399-a6ae0d44a53c" containerName="oc" Mar 14 05:31:46 crc kubenswrapper[4713]: E0314 05:31:46.151139 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerName="extract-content" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151145 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerName="extract-content" Mar 14 05:31:46 crc kubenswrapper[4713]: E0314 05:31:46.151158 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de25776-134d-40b9-bf9f-365ed268f707" containerName="controller-manager" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151164 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de25776-134d-40b9-bf9f-365ed268f707" containerName="controller-manager" Mar 14 05:31:46 crc kubenswrapper[4713]: E0314 05:31:46.151172 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerName="extract-utilities" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151179 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerName="extract-utilities" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151291 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a03b8e-3b89-41c5-9399-a6ae0d44a53c" containerName="oc" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151303 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fad1b56-3545-46d2-b33f-1e2887cb6e6a" containerName="route-controller-manager" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151311 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5979ce26-1ed6-49e0-ac41-24e70593ab24" containerName="registry-server" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151323 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2de25776-134d-40b9-bf9f-365ed268f707" containerName="controller-manager" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.151719 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.182746 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k"] Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.278613 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2de25776-134d-40b9-bf9f-365ed268f707-serving-cert\") pod \"2de25776-134d-40b9-bf9f-365ed268f707\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.278695 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-proxy-ca-bundles\") pod \"2de25776-134d-40b9-bf9f-365ed268f707\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.278762 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-config\") pod \"2de25776-134d-40b9-bf9f-365ed268f707\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.278818 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-config\") pod \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.278841 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-client-ca\") pod \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.278862 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-client-ca\") pod \"2de25776-134d-40b9-bf9f-365ed268f707\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.279076 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-serving-cert\") pod \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.279115 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt84c\" (UniqueName: \"kubernetes.io/projected/2de25776-134d-40b9-bf9f-365ed268f707-kube-api-access-kt84c\") pod \"2de25776-134d-40b9-bf9f-365ed268f707\" (UID: \"2de25776-134d-40b9-bf9f-365ed268f707\") " Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.279140 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz2zm\" (UniqueName: \"kubernetes.io/projected/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-kube-api-access-cz2zm\") pod \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\" (UID: \"5fad1b56-3545-46d2-b33f-1e2887cb6e6a\") " Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.279361 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-proxy-ca-bundles\") pod 
\"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.279419 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtdz\" (UniqueName: \"kubernetes.io/projected/3ddba7f7-97fc-4e47-9172-ad52a73dc091-kube-api-access-kqtdz\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.279457 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-config\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.279497 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ddba7f7-97fc-4e47-9172-ad52a73dc091-serving-cert\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.279554 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-client-ca\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.280396 4713 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-client-ca" (OuterVolumeSpecName: "client-ca") pod "2de25776-134d-40b9-bf9f-365ed268f707" (UID: "2de25776-134d-40b9-bf9f-365ed268f707"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.280403 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-client-ca" (OuterVolumeSpecName: "client-ca") pod "5fad1b56-3545-46d2-b33f-1e2887cb6e6a" (UID: "5fad1b56-3545-46d2-b33f-1e2887cb6e6a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.280456 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2de25776-134d-40b9-bf9f-365ed268f707" (UID: "2de25776-134d-40b9-bf9f-365ed268f707"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.280547 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-config" (OuterVolumeSpecName: "config") pod "5fad1b56-3545-46d2-b33f-1e2887cb6e6a" (UID: "5fad1b56-3545-46d2-b33f-1e2887cb6e6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.281439 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-config" (OuterVolumeSpecName: "config") pod "2de25776-134d-40b9-bf9f-365ed268f707" (UID: "2de25776-134d-40b9-bf9f-365ed268f707"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.284945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-kube-api-access-cz2zm" (OuterVolumeSpecName: "kube-api-access-cz2zm") pod "5fad1b56-3545-46d2-b33f-1e2887cb6e6a" (UID: "5fad1b56-3545-46d2-b33f-1e2887cb6e6a"). InnerVolumeSpecName "kube-api-access-cz2zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.285515 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de25776-134d-40b9-bf9f-365ed268f707-kube-api-access-kt84c" (OuterVolumeSpecName: "kube-api-access-kt84c") pod "2de25776-134d-40b9-bf9f-365ed268f707" (UID: "2de25776-134d-40b9-bf9f-365ed268f707"). InnerVolumeSpecName "kube-api-access-kt84c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.285505 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de25776-134d-40b9-bf9f-365ed268f707-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2de25776-134d-40b9-bf9f-365ed268f707" (UID: "2de25776-134d-40b9-bf9f-365ed268f707"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.285726 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5fad1b56-3545-46d2-b33f-1e2887cb6e6a" (UID: "5fad1b56-3545-46d2-b33f-1e2887cb6e6a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.381630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtdz\" (UniqueName: \"kubernetes.io/projected/3ddba7f7-97fc-4e47-9172-ad52a73dc091-kube-api-access-kqtdz\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.381755 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-config\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.381882 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ddba7f7-97fc-4e47-9172-ad52a73dc091-serving-cert\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382003 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-client-ca\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382065 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-proxy-ca-bundles\") pod 
\"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382262 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382304 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt84c\" (UniqueName: \"kubernetes.io/projected/2de25776-134d-40b9-bf9f-365ed268f707-kube-api-access-kt84c\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382333 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz2zm\" (UniqueName: \"kubernetes.io/projected/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-kube-api-access-cz2zm\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382358 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2de25776-134d-40b9-bf9f-365ed268f707-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382381 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382408 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382431 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-config\") on node 
\"crc\" DevicePath \"\"" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382454 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5fad1b56-3545-46d2-b33f-1e2887cb6e6a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.382479 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2de25776-134d-40b9-bf9f-365ed268f707-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.383740 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-proxy-ca-bundles\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.384615 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-client-ca\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.385366 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-config\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.387636 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ddba7f7-97fc-4e47-9172-ad52a73dc091-serving-cert\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.418768 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqtdz\" (UniqueName: \"kubernetes.io/projected/3ddba7f7-97fc-4e47-9172-ad52a73dc091-kube-api-access-kqtdz\") pod \"controller-manager-5c8bd7c969-5rl6k\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.472149 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.492144 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" event={"ID":"2de25776-134d-40b9-bf9f-365ed268f707","Type":"ContainerDied","Data":"177a8aa1cbb57a8c9a47892fabfe34aa7034ffe790d197cc35337e219e922a2d"} Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.492235 4713 scope.go:117] "RemoveContainer" containerID="7261bfd35dcc35a378e94f78033b7777fa78602b3bdd913631561d729d39a856" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.493004 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.494159 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" event={"ID":"5fad1b56-3545-46d2-b33f-1e2887cb6e6a","Type":"ContainerDied","Data":"4a56f72af8553329fb8a2a1dc6c4297413a5e34b16c37633f9080799d87e6cc0"} Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.494264 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.510254 4713 scope.go:117] "RemoveContainer" containerID="f03fc4aa01255854faca8fd734cbdf76f01b594e47525187cc617a9d0955b559" Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.531667 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk"] Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.548024 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d5b4c97-v8rrk"] Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.549659 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5"] Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.552017 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-558786cf58-pqpf5"] Mar 14 05:31:46 crc kubenswrapper[4713]: I0314 05:31:46.672477 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k"] Mar 14 05:31:46 crc kubenswrapper[4713]: W0314 05:31:46.678656 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ddba7f7_97fc_4e47_9172_ad52a73dc091.slice/crio-6c1c32954c4a85d76950af36c5124ea526dd3f7dda367aeede6dbddfd88261c7 WatchSource:0}: Error finding container 6c1c32954c4a85d76950af36c5124ea526dd3f7dda367aeede6dbddfd88261c7: Status 404 returned error can't find the container with id 6c1c32954c4a85d76950af36c5124ea526dd3f7dda367aeede6dbddfd88261c7 Mar 14 05:31:47 crc kubenswrapper[4713]: I0314 05:31:47.522791 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" event={"ID":"3ddba7f7-97fc-4e47-9172-ad52a73dc091","Type":"ContainerStarted","Data":"b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6"} Mar 14 05:31:47 crc kubenswrapper[4713]: I0314 05:31:47.523377 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" event={"ID":"3ddba7f7-97fc-4e47-9172-ad52a73dc091","Type":"ContainerStarted","Data":"6c1c32954c4a85d76950af36c5124ea526dd3f7dda367aeede6dbddfd88261c7"} Mar 14 05:31:47 crc kubenswrapper[4713]: I0314 05:31:47.523404 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:47 crc kubenswrapper[4713]: I0314 05:31:47.525539 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbxl4" event={"ID":"caec2fda-bb55-4f4f-a487-8deeb4bf5da4","Type":"ContainerStarted","Data":"5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690"} Mar 14 05:31:47 crc kubenswrapper[4713]: I0314 05:31:47.554918 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" podStartSLOduration=3.554894689 podStartE2EDuration="3.554894689s" podCreationTimestamp="2026-03-14 05:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:31:47.5545899 +0000 UTC m=+290.642499200" watchObservedRunningTime="2026-03-14 05:31:47.554894689 +0000 UTC m=+290.642803989" Mar 14 05:31:47 crc kubenswrapper[4713]: I0314 05:31:47.571105 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de25776-134d-40b9-bf9f-365ed268f707" path="/var/lib/kubelet/pods/2de25776-134d-40b9-bf9f-365ed268f707/volumes" Mar 14 05:31:47 crc kubenswrapper[4713]: I0314 05:31:47.571644 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fad1b56-3545-46d2-b33f-1e2887cb6e6a" path="/var/lib/kubelet/pods/5fad1b56-3545-46d2-b33f-1e2887cb6e6a/volumes" Mar 14 05:31:47 crc kubenswrapper[4713]: I0314 05:31:47.582068 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.413500 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g"] Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.414578 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.416558 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.417132 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.417138 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.417280 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.417330 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.418656 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.425919 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g"] Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.522574 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-config\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.522636 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwfhl\" (UniqueName: \"kubernetes.io/projected/53964d3f-aab6-480e-93da-727e99502e1e-kube-api-access-kwfhl\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.522683 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-client-ca\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.522703 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53964d3f-aab6-480e-93da-727e99502e1e-serving-cert\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.533565 4713 generic.go:334] "Generic (PLEG): container finished" podID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerID="5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690" exitCode=0 Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.533628 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbxl4" event={"ID":"caec2fda-bb55-4f4f-a487-8deeb4bf5da4","Type":"ContainerDied","Data":"5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690"} Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.624281 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-config\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.624677 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwfhl\" (UniqueName: \"kubernetes.io/projected/53964d3f-aab6-480e-93da-727e99502e1e-kube-api-access-kwfhl\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.624733 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-client-ca\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.624753 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53964d3f-aab6-480e-93da-727e99502e1e-serving-cert\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.625655 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-config\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.625779 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-client-ca\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.630507 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53964d3f-aab6-480e-93da-727e99502e1e-serving-cert\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.641588 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwfhl\" (UniqueName: \"kubernetes.io/projected/53964d3f-aab6-480e-93da-727e99502e1e-kube-api-access-kwfhl\") pod \"route-controller-manager-6dbccbd94c-2pf8g\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:48 crc kubenswrapper[4713]: I0314 05:31:48.743791 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:50 crc kubenswrapper[4713]: I0314 05:31:50.441429 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g"] Mar 14 05:31:50 crc kubenswrapper[4713]: I0314 05:31:50.547305 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbxl4" event={"ID":"caec2fda-bb55-4f4f-a487-8deeb4bf5da4","Type":"ContainerStarted","Data":"dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be"} Mar 14 05:31:50 crc kubenswrapper[4713]: I0314 05:31:50.549827 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fddx" event={"ID":"e5416d4f-43bb-4ca5-a433-1e408bc69d26","Type":"ContainerStarted","Data":"644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10"} Mar 14 05:31:50 crc kubenswrapper[4713]: I0314 05:31:50.556384 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nc5f" event={"ID":"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f","Type":"ContainerStarted","Data":"f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c"} Mar 14 05:31:50 crc kubenswrapper[4713]: I0314 05:31:50.559220 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncw4" event={"ID":"c49b0182-cb22-4f55-b7a8-893646fa21fe","Type":"ContainerStarted","Data":"3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7"} Mar 14 05:31:50 crc kubenswrapper[4713]: I0314 05:31:50.560933 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" event={"ID":"53964d3f-aab6-480e-93da-727e99502e1e","Type":"ContainerStarted","Data":"d33c933cfa43009f43128e73c3423dfd2dae811fb221037d0b35d19be716389b"} Mar 14 05:31:50 crc kubenswrapper[4713]: 
I0314 05:31:50.583512 4713 generic.go:334] "Generic (PLEG): container finished" podID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerID="714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77" exitCode=0 Mar 14 05:31:50 crc kubenswrapper[4713]: I0314 05:31:50.583568 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zt2" event={"ID":"2bdf5393-1e5e-4965-a24c-b45a22c6053e","Type":"ContainerDied","Data":"714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77"} Mar 14 05:31:50 crc kubenswrapper[4713]: I0314 05:31:50.583763 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zbxl4" podStartSLOduration=2.905869648 podStartE2EDuration="1m3.583748437s" podCreationTimestamp="2026-03-14 05:30:47 +0000 UTC" firstStartedPulling="2026-03-14 05:30:49.335179554 +0000 UTC m=+232.423088854" lastFinishedPulling="2026-03-14 05:31:50.013058333 +0000 UTC m=+293.100967643" observedRunningTime="2026-03-14 05:31:50.568970303 +0000 UTC m=+293.656879613" watchObservedRunningTime="2026-03-14 05:31:50.583748437 +0000 UTC m=+293.671657737" Mar 14 05:31:50 crc kubenswrapper[4713]: I0314 05:31:50.642576 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5nc5f" podStartSLOduration=2.998836185 podStartE2EDuration="1m3.642556143s" podCreationTimestamp="2026-03-14 05:30:47 +0000 UTC" firstStartedPulling="2026-03-14 05:30:49.346735285 +0000 UTC m=+232.434644585" lastFinishedPulling="2026-03-14 05:31:49.990455223 +0000 UTC m=+293.078364543" observedRunningTime="2026-03-14 05:31:50.618954402 +0000 UTC m=+293.706863782" watchObservedRunningTime="2026-03-14 05:31:50.642556143 +0000 UTC m=+293.730465453" Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.163765 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 
05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.209246 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dvtpw" Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.482808 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.526915 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2pwlb" Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.592696 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" event={"ID":"53964d3f-aab6-480e-93da-727e99502e1e","Type":"ContainerStarted","Data":"07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc"} Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.593131 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.594515 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zt2" event={"ID":"2bdf5393-1e5e-4965-a24c-b45a22c6053e","Type":"ContainerStarted","Data":"dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b"} Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.596493 4713 generic.go:334] "Generic (PLEG): container finished" podID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerID="644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10" exitCode=0 Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.596560 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fddx" 
event={"ID":"e5416d4f-43bb-4ca5-a433-1e408bc69d26","Type":"ContainerDied","Data":"644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10"}
Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.598638 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g"
Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.603951 4713 generic.go:334] "Generic (PLEG): container finished" podID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerID="3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7" exitCode=0
Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.604793 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncw4" event={"ID":"c49b0182-cb22-4f55-b7a8-893646fa21fe","Type":"ContainerDied","Data":"3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7"}
Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.619436 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" podStartSLOduration=7.619415527 podStartE2EDuration="7.619415527s" podCreationTimestamp="2026-03-14 05:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:31:51.617072844 +0000 UTC m=+294.704982144" watchObservedRunningTime="2026-03-14 05:31:51.619415527 +0000 UTC m=+294.707324827"
Mar 14 05:31:51 crc kubenswrapper[4713]: I0314 05:31:51.643805 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b9zt2" podStartSLOduration=2.054871195 podStartE2EDuration="1m2.643781102s" podCreationTimestamp="2026-03-14 05:30:49 +0000 UTC" firstStartedPulling="2026-03-14 05:30:50.459199577 +0000 UTC m=+233.547108877" lastFinishedPulling="2026-03-14 05:31:51.048109484 +0000 UTC m=+294.136018784" observedRunningTime="2026-03-14 05:31:51.639373014 +0000 UTC m=+294.727282574" watchObservedRunningTime="2026-03-14 05:31:51.643781102 +0000 UTC m=+294.731690402"
Mar 14 05:31:52 crc kubenswrapper[4713]: I0314 05:31:52.615873 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fddx" event={"ID":"e5416d4f-43bb-4ca5-a433-1e408bc69d26","Type":"ContainerStarted","Data":"c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27"}
Mar 14 05:31:52 crc kubenswrapper[4713]: I0314 05:31:52.620102 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncw4" event={"ID":"c49b0182-cb22-4f55-b7a8-893646fa21fe","Type":"ContainerStarted","Data":"88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed"}
Mar 14 05:31:52 crc kubenswrapper[4713]: I0314 05:31:52.634666 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2fddx" podStartSLOduration=2.866460611 podStartE2EDuration="1m5.634647836s" podCreationTimestamp="2026-03-14 05:30:47 +0000 UTC" firstStartedPulling="2026-03-14 05:30:49.255886654 +0000 UTC m=+232.343795954" lastFinishedPulling="2026-03-14 05:31:52.024073879 +0000 UTC m=+295.111983179" observedRunningTime="2026-03-14 05:31:52.633600503 +0000 UTC m=+295.721509803" watchObservedRunningTime="2026-03-14 05:31:52.634647836 +0000 UTC m=+295.722557126"
Mar 14 05:31:52 crc kubenswrapper[4713]: I0314 05:31:52.653361 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jncw4" podStartSLOduration=2.884651511 podStartE2EDuration="1m5.653340963s" podCreationTimestamp="2026-03-14 05:30:47 +0000 UTC" firstStartedPulling="2026-03-14 05:30:49.324682683 +0000 UTC m=+232.412591993" lastFinishedPulling="2026-03-14 05:31:52.093372145 +0000 UTC m=+295.181281445" observedRunningTime="2026-03-14 05:31:52.651247677 +0000 UTC m=+295.739156987" watchObservedRunningTime="2026-03-14 05:31:52.653340963 +0000 UTC m=+295.741250253"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.015633 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2pwlb"]
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.015904 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2pwlb" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerName="registry-server" containerID="cri-o://81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a" gracePeriod=2
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.490921 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pwlb"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.610341 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-utilities\") pod \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") "
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.610487 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxh2k\" (UniqueName: \"kubernetes.io/projected/4cd7d2d9-c704-4019-9329-52c5fa68af0d-kube-api-access-mxh2k\") pod \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") "
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.610526 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-catalog-content\") pod \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\" (UID: \"4cd7d2d9-c704-4019-9329-52c5fa68af0d\") "
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.611509 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-utilities" (OuterVolumeSpecName: "utilities") pod "4cd7d2d9-c704-4019-9329-52c5fa68af0d" (UID: "4cd7d2d9-c704-4019-9329-52c5fa68af0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.617542 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd7d2d9-c704-4019-9329-52c5fa68af0d-kube-api-access-mxh2k" (OuterVolumeSpecName: "kube-api-access-mxh2k") pod "4cd7d2d9-c704-4019-9329-52c5fa68af0d" (UID: "4cd7d2d9-c704-4019-9329-52c5fa68af0d"). InnerVolumeSpecName "kube-api-access-mxh2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.636288 4713 generic.go:334] "Generic (PLEG): container finished" podID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerID="81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a" exitCode=0
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.636345 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pwlb" event={"ID":"4cd7d2d9-c704-4019-9329-52c5fa68af0d","Type":"ContainerDied","Data":"81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a"}
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.636392 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pwlb" event={"ID":"4cd7d2d9-c704-4019-9329-52c5fa68af0d","Type":"ContainerDied","Data":"ed2d030d694f118d6492da5d47346980a628e2af44d16db19158aab86023a6c1"}
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.636400 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pwlb"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.636414 4713 scope.go:117] "RemoveContainer" containerID="81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.652605 4713 scope.go:117] "RemoveContainer" containerID="847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.672687 4713 scope.go:117] "RemoveContainer" containerID="f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.698788 4713 scope.go:117] "RemoveContainer" containerID="81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a"
Mar 14 05:31:54 crc kubenswrapper[4713]: E0314 05:31:54.699386 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a\": container with ID starting with 81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a not found: ID does not exist" containerID="81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.699448 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a"} err="failed to get container status \"81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a\": rpc error: code = NotFound desc = could not find container \"81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a\": container with ID starting with 81f1ed791e22b92396c7b2d5517aae544311e170c0db4ff7fcf742c9eb7e8f4a not found: ID does not exist"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.699483 4713 scope.go:117] "RemoveContainer" containerID="847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e"
Mar 14 05:31:54 crc kubenswrapper[4713]: E0314 05:31:54.699909 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e\": container with ID starting with 847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e not found: ID does not exist" containerID="847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.699945 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e"} err="failed to get container status \"847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e\": rpc error: code = NotFound desc = could not find container \"847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e\": container with ID starting with 847244c2027d9c95ec81f020ffe0e1ab20a6adfa9e6169cd4b98fd4da989942e not found: ID does not exist"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.699966 4713 scope.go:117] "RemoveContainer" containerID="f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246"
Mar 14 05:31:54 crc kubenswrapper[4713]: E0314 05:31:54.700291 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246\": container with ID starting with f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246 not found: ID does not exist" containerID="f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.700343 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246"} err="failed to get container status \"f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246\": rpc error: code = NotFound desc = could not find container \"f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246\": container with ID starting with f48186e332183d9a7fe837f7837695b63aed41a7cd8a09589594a281a6be2246 not found: ID does not exist"
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.712552 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.712582 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxh2k\" (UniqueName: \"kubernetes.io/projected/4cd7d2d9-c704-4019-9329-52c5fa68af0d-kube-api-access-mxh2k\") on node \"crc\" DevicePath \"\""
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.768524 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4cd7d2d9-c704-4019-9329-52c5fa68af0d" (UID: "4cd7d2d9-c704-4019-9329-52c5fa68af0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.813832 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd7d2d9-c704-4019-9329-52c5fa68af0d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.969696 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2pwlb"]
Mar 14 05:31:54 crc kubenswrapper[4713]: I0314 05:31:54.973064 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2pwlb"]
Mar 14 05:31:55 crc kubenswrapper[4713]: I0314 05:31:55.572581 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" path="/var/lib/kubelet/pods/4cd7d2d9-c704-4019-9329-52c5fa68af0d/volumes"
Mar 14 05:31:57 crc kubenswrapper[4713]: I0314 05:31:57.703196 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jncw4"
Mar 14 05:31:57 crc kubenswrapper[4713]: I0314 05:31:57.703506 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jncw4"
Mar 14 05:31:57 crc kubenswrapper[4713]: I0314 05:31:57.754694 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jncw4"
Mar 14 05:31:57 crc kubenswrapper[4713]: I0314 05:31:57.928669 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zbxl4"
Mar 14 05:31:57 crc kubenswrapper[4713]: I0314 05:31:57.929132 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zbxl4"
Mar 14 05:31:57 crc kubenswrapper[4713]: I0314 05:31:57.986510 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zbxl4"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.048575 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2fddx"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.048641 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2fddx"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.093178 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2fddx"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.270884 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.271628 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.309013 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.702982 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jncw4"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.703873 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zbxl4"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.722091 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2fddx"
Mar 14 05:31:58 crc kubenswrapper[4713]: I0314 05:31:58.737442 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:31:59 crc kubenswrapper[4713]: I0314 05:31:59.721620 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b9zt2"
Mar 14 05:31:59 crc kubenswrapper[4713]: I0314 05:31:59.721695 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b9zt2"
Mar 14 05:31:59 crc kubenswrapper[4713]: I0314 05:31:59.783663 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b9zt2"
Mar 14 05:31:59 crc kubenswrapper[4713]: I0314 05:31:59.971882 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9rqn"]
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.018419 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fddx"]
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.132558 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557772-flfn4"]
Mar 14 05:32:00 crc kubenswrapper[4713]: E0314 05:32:00.132772 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerName="extract-content"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.132786 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerName="extract-content"
Mar 14 05:32:00 crc kubenswrapper[4713]: E0314 05:32:00.132797 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerName="extract-utilities"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.132804 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerName="extract-utilities"
Mar 14 05:32:00 crc kubenswrapper[4713]: E0314 05:32:00.132820 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerName="registry-server"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.132826 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerName="registry-server"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.132952 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd7d2d9-c704-4019-9329-52c5fa68af0d" containerName="registry-server"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.133360 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557772-flfn4"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.137829 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.138055 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.138232 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.142372 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557772-flfn4"]
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.216922 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5nc5f"]
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.290783 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp85k\" (UniqueName: \"kubernetes.io/projected/d53a961b-784c-4859-a121-59718c4c44a7-kube-api-access-fp85k\") pod \"auto-csr-approver-29557772-flfn4\" (UID: \"d53a961b-784c-4859-a121-59718c4c44a7\") " pod="openshift-infra/auto-csr-approver-29557772-flfn4"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.391941 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp85k\" (UniqueName: \"kubernetes.io/projected/d53a961b-784c-4859-a121-59718c4c44a7-kube-api-access-fp85k\") pod \"auto-csr-approver-29557772-flfn4\" (UID: \"d53a961b-784c-4859-a121-59718c4c44a7\") " pod="openshift-infra/auto-csr-approver-29557772-flfn4"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.411001 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp85k\" (UniqueName: \"kubernetes.io/projected/d53a961b-784c-4859-a121-59718c4c44a7-kube-api-access-fp85k\") pod \"auto-csr-approver-29557772-flfn4\" (UID: \"d53a961b-784c-4859-a121-59718c4c44a7\") " pod="openshift-infra/auto-csr-approver-29557772-flfn4"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.449569 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557772-flfn4"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.677856 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2fddx" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerName="registry-server" containerID="cri-o://c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27" gracePeriod=2
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.678115 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5nc5f" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerName="registry-server" containerID="cri-o://f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c" gracePeriod=2
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.728710 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b9zt2"
Mar 14 05:32:00 crc kubenswrapper[4713]: I0314 05:32:00.918806 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557772-flfn4"]
Mar 14 05:32:00 crc kubenswrapper[4713]: W0314 05:32:00.931567 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd53a961b_784c_4859_a121_59718c4c44a7.slice/crio-1f942ba626cc4d2d81250665ff5a1c51d4c147e88cc62f395989e31b21681c0c WatchSource:0}: Error finding container 1f942ba626cc4d2d81250665ff5a1c51d4c147e88cc62f395989e31b21681c0c: Status 404 returned error can't find the container with id 1f942ba626cc4d2d81250665ff5a1c51d4c147e88cc62f395989e31b21681c0c
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.193772 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fddx"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.271380 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.308712 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4cng\" (UniqueName: \"kubernetes.io/projected/e5416d4f-43bb-4ca5-a433-1e408bc69d26-kube-api-access-q4cng\") pod \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") "
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.308857 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-utilities\") pod \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") "
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.308882 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-utilities\") pod \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") "
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.309823 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-utilities" (OuterVolumeSpecName: "utilities") pod "c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" (UID: "c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.309842 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-utilities" (OuterVolumeSpecName: "utilities") pod "e5416d4f-43bb-4ca5-a433-1e408bc69d26" (UID: "e5416d4f-43bb-4ca5-a433-1e408bc69d26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.309896 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz7r9\" (UniqueName: \"kubernetes.io/projected/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-kube-api-access-wz7r9\") pod \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") "
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.309928 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-catalog-content\") pod \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\" (UID: \"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f\") "
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.310411 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-catalog-content\") pod \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\" (UID: \"e5416d4f-43bb-4ca5-a433-1e408bc69d26\") "
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.311404 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.311422 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.315241 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-kube-api-access-wz7r9" (OuterVolumeSpecName: "kube-api-access-wz7r9") pod "c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" (UID: "c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f"). InnerVolumeSpecName "kube-api-access-wz7r9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.316597 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5416d4f-43bb-4ca5-a433-1e408bc69d26-kube-api-access-q4cng" (OuterVolumeSpecName: "kube-api-access-q4cng") pod "e5416d4f-43bb-4ca5-a433-1e408bc69d26" (UID: "e5416d4f-43bb-4ca5-a433-1e408bc69d26"). InnerVolumeSpecName "kube-api-access-q4cng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.377675 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" (UID: "c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.386006 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5416d4f-43bb-4ca5-a433-1e408bc69d26" (UID: "e5416d4f-43bb-4ca5-a433-1e408bc69d26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.413726 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz7r9\" (UniqueName: \"kubernetes.io/projected/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-kube-api-access-wz7r9\") on node \"crc\" DevicePath \"\""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.413776 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.413789 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5416d4f-43bb-4ca5-a433-1e408bc69d26-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.413800 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4cng\" (UniqueName: \"kubernetes.io/projected/e5416d4f-43bb-4ca5-a433-1e408bc69d26-kube-api-access-q4cng\") on node \"crc\" DevicePath \"\""
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.684145 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerID="f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c" exitCode=0
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.684215 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nc5f"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.684238 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nc5f" event={"ID":"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f","Type":"ContainerDied","Data":"f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c"}
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.684291 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nc5f" event={"ID":"c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f","Type":"ContainerDied","Data":"adadb202a96a06eea6e0e9f50242452b86140ad92ac840a20fbadd4c4904cb1d"}
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.684309 4713 scope.go:117] "RemoveContainer" containerID="f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.685135 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557772-flfn4" event={"ID":"d53a961b-784c-4859-a121-59718c4c44a7","Type":"ContainerStarted","Data":"1f942ba626cc4d2d81250665ff5a1c51d4c147e88cc62f395989e31b21681c0c"}
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.688367 4713 generic.go:334] "Generic (PLEG): container finished" podID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerID="c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27" exitCode=0
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.688454 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fddx" event={"ID":"e5416d4f-43bb-4ca5-a433-1e408bc69d26","Type":"ContainerDied","Data":"c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27"}
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.688479 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fddx"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.688496 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fddx" event={"ID":"e5416d4f-43bb-4ca5-a433-1e408bc69d26","Type":"ContainerDied","Data":"a747263f5a8e8cf9d309d5cb69a79f043e6c5cb33cf9efc6b1ef6e81aeabc238"}
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.704029 4713 scope.go:117] "RemoveContainer" containerID="adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.708337 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5nc5f"]
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.712831 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5nc5f"]
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.727734 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fddx"]
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.727787 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2fddx"]
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.727911 4713 scope.go:117] "RemoveContainer" containerID="fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.758666 4713 scope.go:117] "RemoveContainer" containerID="f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c"
Mar 14 05:32:01 crc kubenswrapper[4713]: E0314 05:32:01.759297 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c\": container with ID starting with f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c not found: ID does not exist" containerID="f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.759361 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c"} err="failed to get container status \"f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c\": rpc error: code = NotFound desc = could not find container \"f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c\": container with ID starting with f60fd5a45e83c51587d025de4fec8b8c33626a022cf8bb6e07590d04e2f9c53c not found: ID does not exist"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.759410 4713 scope.go:117] "RemoveContainer" containerID="adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a"
Mar 14 05:32:01 crc kubenswrapper[4713]: E0314 05:32:01.759845 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a\": container with ID starting with adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a not found: ID does not exist" containerID="adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.759901 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a"} err="failed to get container status \"adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a\": rpc error: code = NotFound desc = could not find container \"adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a\": container with ID starting with adb86c6daf54b08771c732f0d484ede9f32da7a832c0eb853804289b0139dd4a not found: ID does not exist"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.759934 4713 scope.go:117] "RemoveContainer" containerID="fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51"
Mar 14 05:32:01 crc kubenswrapper[4713]: E0314 05:32:01.763227 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51\": container with ID starting with fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51 not found: ID does not exist" containerID="fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.763265 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51"} err="failed to get container status \"fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51\": rpc error: code = NotFound desc = could not find container \"fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51\": container with ID starting with fdadbe54efbe673a417051c21055f7bf0ca8cb9fa47b893f7201a1268ad20a51 not found: ID does not exist"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.763287 4713 scope.go:117] "RemoveContainer" containerID="c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.777328 4713 scope.go:117] "RemoveContainer" containerID="644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.792039 4713 scope.go:117] "RemoveContainer" containerID="5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522"
Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.813096 4713 scope.go:117] "RemoveContainer" containerID="c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27"
Mar 14 05:32:01 crc kubenswrapper[4713]: E0314 05:32:01.813829 4713 log.go:32] "ContainerStatus from runtime
service failed" err="rpc error: code = NotFound desc = could not find container \"c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27\": container with ID starting with c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27 not found: ID does not exist" containerID="c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27" Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.813888 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27"} err="failed to get container status \"c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27\": rpc error: code = NotFound desc = could not find container \"c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27\": container with ID starting with c9ebb2eff7044c84084046ddd05c1f8976c84c6763fbbd6e005e9214517cbe27 not found: ID does not exist" Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.813939 4713 scope.go:117] "RemoveContainer" containerID="644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10" Mar 14 05:32:01 crc kubenswrapper[4713]: E0314 05:32:01.814472 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10\": container with ID starting with 644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10 not found: ID does not exist" containerID="644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10" Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.814504 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10"} err="failed to get container status \"644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10\": rpc error: code = NotFound desc = could not find container 
\"644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10\": container with ID starting with 644984b2e3e39da45753b3d7b0d1efc67627b53569a844a192ba9403ac865a10 not found: ID does not exist" Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.814531 4713 scope.go:117] "RemoveContainer" containerID="5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522" Mar 14 05:32:01 crc kubenswrapper[4713]: E0314 05:32:01.814798 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522\": container with ID starting with 5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522 not found: ID does not exist" containerID="5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522" Mar 14 05:32:01 crc kubenswrapper[4713]: I0314 05:32:01.814842 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522"} err="failed to get container status \"5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522\": rpc error: code = NotFound desc = could not find container \"5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522\": container with ID starting with 5665e3bd321e3f66df1739d795e9eb430530fa004b0c09b742ea2b56b4e7a522 not found: ID does not exist" Mar 14 05:32:02 crc kubenswrapper[4713]: I0314 05:32:02.699859 4713 generic.go:334] "Generic (PLEG): container finished" podID="d53a961b-784c-4859-a121-59718c4c44a7" containerID="346e347bb9286f36a62d6ab11154e79a291c26ad5e8da9f5ef749a0cca2f6397" exitCode=0 Mar 14 05:32:02 crc kubenswrapper[4713]: I0314 05:32:02.699932 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557772-flfn4" 
event={"ID":"d53a961b-784c-4859-a121-59718c4c44a7","Type":"ContainerDied","Data":"346e347bb9286f36a62d6ab11154e79a291c26ad5e8da9f5ef749a0cca2f6397"} Mar 14 05:32:03 crc kubenswrapper[4713]: I0314 05:32:03.578101 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" path="/var/lib/kubelet/pods/c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f/volumes" Mar 14 05:32:03 crc kubenswrapper[4713]: I0314 05:32:03.578938 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" path="/var/lib/kubelet/pods/e5416d4f-43bb-4ca5-a433-1e408bc69d26/volumes" Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.095821 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557772-flfn4" Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.177317 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp85k\" (UniqueName: \"kubernetes.io/projected/d53a961b-784c-4859-a121-59718c4c44a7-kube-api-access-fp85k\") pod \"d53a961b-784c-4859-a121-59718c4c44a7\" (UID: \"d53a961b-784c-4859-a121-59718c4c44a7\") " Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.191419 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d53a961b-784c-4859-a121-59718c4c44a7-kube-api-access-fp85k" (OuterVolumeSpecName: "kube-api-access-fp85k") pod "d53a961b-784c-4859-a121-59718c4c44a7" (UID: "d53a961b-784c-4859-a121-59718c4c44a7"). InnerVolumeSpecName "kube-api-access-fp85k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.279016 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp85k\" (UniqueName: \"kubernetes.io/projected/d53a961b-784c-4859-a121-59718c4c44a7-kube-api-access-fp85k\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.715285 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557772-flfn4" event={"ID":"d53a961b-784c-4859-a121-59718c4c44a7","Type":"ContainerDied","Data":"1f942ba626cc4d2d81250665ff5a1c51d4c147e88cc62f395989e31b21681c0c"} Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.715339 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f942ba626cc4d2d81250665ff5a1c51d4c147e88cc62f395989e31b21681c0c" Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.715302 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557772-flfn4" Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.729114 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k"] Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.729363 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" podUID="3ddba7f7-97fc-4e47-9172-ad52a73dc091" containerName="controller-manager" containerID="cri-o://b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6" gracePeriod=30 Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.853442 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g"] Mar 14 05:32:04 crc kubenswrapper[4713]: I0314 05:32:04.853679 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" podUID="53964d3f-aab6-480e-93da-727e99502e1e" containerName="route-controller-manager" containerID="cri-o://07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc" gracePeriod=30 Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.357579 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.473169 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.494991 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53964d3f-aab6-480e-93da-727e99502e1e-serving-cert\") pod \"53964d3f-aab6-480e-93da-727e99502e1e\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.495075 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-client-ca\") pod \"53964d3f-aab6-480e-93da-727e99502e1e\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.495105 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwfhl\" (UniqueName: \"kubernetes.io/projected/53964d3f-aab6-480e-93da-727e99502e1e-kube-api-access-kwfhl\") pod \"53964d3f-aab6-480e-93da-727e99502e1e\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.495176 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-config\") pod \"53964d3f-aab6-480e-93da-727e99502e1e\" (UID: \"53964d3f-aab6-480e-93da-727e99502e1e\") " Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.495932 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-client-ca" (OuterVolumeSpecName: "client-ca") pod "53964d3f-aab6-480e-93da-727e99502e1e" (UID: "53964d3f-aab6-480e-93da-727e99502e1e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.495992 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-config" (OuterVolumeSpecName: "config") pod "53964d3f-aab6-480e-93da-727e99502e1e" (UID: "53964d3f-aab6-480e-93da-727e99502e1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.496313 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.496336 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53964d3f-aab6-480e-93da-727e99502e1e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.500401 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53964d3f-aab6-480e-93da-727e99502e1e-kube-api-access-kwfhl" (OuterVolumeSpecName: "kube-api-access-kwfhl") pod "53964d3f-aab6-480e-93da-727e99502e1e" (UID: "53964d3f-aab6-480e-93da-727e99502e1e"). InnerVolumeSpecName "kube-api-access-kwfhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.507980 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53964d3f-aab6-480e-93da-727e99502e1e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "53964d3f-aab6-480e-93da-727e99502e1e" (UID: "53964d3f-aab6-480e-93da-727e99502e1e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.597130 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ddba7f7-97fc-4e47-9172-ad52a73dc091-serving-cert\") pod \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.597187 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqtdz\" (UniqueName: \"kubernetes.io/projected/3ddba7f7-97fc-4e47-9172-ad52a73dc091-kube-api-access-kqtdz\") pod \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.597276 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-proxy-ca-bundles\") pod \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.597296 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-client-ca\") pod \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.597329 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-config\") pod \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\" (UID: \"3ddba7f7-97fc-4e47-9172-ad52a73dc091\") " Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.597510 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwfhl\" (UniqueName: \"kubernetes.io/projected/53964d3f-aab6-480e-93da-727e99502e1e-kube-api-access-kwfhl\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.597522 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53964d3f-aab6-480e-93da-727e99502e1e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.598199 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-client-ca" (OuterVolumeSpecName: "client-ca") pod "3ddba7f7-97fc-4e47-9172-ad52a73dc091" (UID: "3ddba7f7-97fc-4e47-9172-ad52a73dc091"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.598315 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3ddba7f7-97fc-4e47-9172-ad52a73dc091" (UID: "3ddba7f7-97fc-4e47-9172-ad52a73dc091"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.598420 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-config" (OuterVolumeSpecName: "config") pod "3ddba7f7-97fc-4e47-9172-ad52a73dc091" (UID: "3ddba7f7-97fc-4e47-9172-ad52a73dc091"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.599872 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ddba7f7-97fc-4e47-9172-ad52a73dc091-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3ddba7f7-97fc-4e47-9172-ad52a73dc091" (UID: "3ddba7f7-97fc-4e47-9172-ad52a73dc091"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.601013 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ddba7f7-97fc-4e47-9172-ad52a73dc091-kube-api-access-kqtdz" (OuterVolumeSpecName: "kube-api-access-kqtdz") pod "3ddba7f7-97fc-4e47-9172-ad52a73dc091" (UID: "3ddba7f7-97fc-4e47-9172-ad52a73dc091"). InnerVolumeSpecName "kube-api-access-kqtdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.699118 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqtdz\" (UniqueName: \"kubernetes.io/projected/3ddba7f7-97fc-4e47-9172-ad52a73dc091-kube-api-access-kqtdz\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.699161 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.699171 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.699181 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3ddba7f7-97fc-4e47-9172-ad52a73dc091-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.699299 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ddba7f7-97fc-4e47-9172-ad52a73dc091-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.721468 4713 generic.go:334] "Generic (PLEG): container finished" podID="3ddba7f7-97fc-4e47-9172-ad52a73dc091" containerID="b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6" exitCode=0 Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.721535 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" event={"ID":"3ddba7f7-97fc-4e47-9172-ad52a73dc091","Type":"ContainerDied","Data":"b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6"} Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.721559 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.721583 4713 scope.go:117] "RemoveContainer" containerID="b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.721568 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k" event={"ID":"3ddba7f7-97fc-4e47-9172-ad52a73dc091","Type":"ContainerDied","Data":"6c1c32954c4a85d76950af36c5124ea526dd3f7dda367aeede6dbddfd88261c7"} Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.724249 4713 generic.go:334] "Generic (PLEG): container finished" podID="53964d3f-aab6-480e-93da-727e99502e1e" containerID="07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc" exitCode=0 Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.724289 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.724298 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" event={"ID":"53964d3f-aab6-480e-93da-727e99502e1e","Type":"ContainerDied","Data":"07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc"} Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.724331 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g" event={"ID":"53964d3f-aab6-480e-93da-727e99502e1e","Type":"ContainerDied","Data":"d33c933cfa43009f43128e73c3423dfd2dae811fb221037d0b35d19be716389b"} Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.737423 4713 scope.go:117] "RemoveContainer" containerID="b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6" Mar 14 05:32:05 crc 
kubenswrapper[4713]: E0314 05:32:05.737939 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6\": container with ID starting with b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6 not found: ID does not exist" containerID="b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.737993 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6"} err="failed to get container status \"b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6\": rpc error: code = NotFound desc = could not find container \"b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6\": container with ID starting with b211eb1bed67957c4ff06eb3e76580943180fdc3b8789479b400cbe397aaf3a6 not found: ID does not exist" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.738024 4713 scope.go:117] "RemoveContainer" containerID="07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.744590 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g"] Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.747014 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbccbd94c-2pf8g"] Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.753630 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k"] Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.758215 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-5c8bd7c969-5rl6k"] Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.758508 4713 scope.go:117] "RemoveContainer" containerID="07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc" Mar 14 05:32:05 crc kubenswrapper[4713]: E0314 05:32:05.759134 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc\": container with ID starting with 07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc not found: ID does not exist" containerID="07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc" Mar 14 05:32:05 crc kubenswrapper[4713]: I0314 05:32:05.759165 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc"} err="failed to get container status \"07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc\": rpc error: code = NotFound desc = could not find container \"07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc\": container with ID starting with 07b0ef967d508fd6efac5fa98ac49f9445ad6a9a53213877df0686ec0d5035fc not found: ID does not exist" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.431443 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt"] Mar 14 05:32:06 crc kubenswrapper[4713]: E0314 05:32:06.432067 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerName="extract-content" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.432164 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerName="extract-content" Mar 14 05:32:06 crc kubenswrapper[4713]: E0314 05:32:06.432268 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="53964d3f-aab6-480e-93da-727e99502e1e" containerName="route-controller-manager" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.432361 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="53964d3f-aab6-480e-93da-727e99502e1e" containerName="route-controller-manager" Mar 14 05:32:06 crc kubenswrapper[4713]: E0314 05:32:06.432444 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53a961b-784c-4859-a121-59718c4c44a7" containerName="oc" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.432559 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53a961b-784c-4859-a121-59718c4c44a7" containerName="oc" Mar 14 05:32:06 crc kubenswrapper[4713]: E0314 05:32:06.432646 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerName="registry-server" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.432716 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerName="registry-server" Mar 14 05:32:06 crc kubenswrapper[4713]: E0314 05:32:06.432793 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerName="extract-utilities" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.432880 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerName="extract-utilities" Mar 14 05:32:06 crc kubenswrapper[4713]: E0314 05:32:06.432955 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ddba7f7-97fc-4e47-9172-ad52a73dc091" containerName="controller-manager" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.433027 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ddba7f7-97fc-4e47-9172-ad52a73dc091" containerName="controller-manager" Mar 14 05:32:06 crc kubenswrapper[4713]: E0314 05:32:06.433100 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerName="extract-content" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.433170 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerName="extract-content" Mar 14 05:32:06 crc kubenswrapper[4713]: E0314 05:32:06.433264 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerName="extract-utilities" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.433353 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerName="extract-utilities" Mar 14 05:32:06 crc kubenswrapper[4713]: E0314 05:32:06.433430 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerName="registry-server" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.433510 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerName="registry-server" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.433740 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="53964d3f-aab6-480e-93da-727e99502e1e" containerName="route-controller-manager" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.433828 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53a961b-784c-4859-a121-59718c4c44a7" containerName="oc" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.433902 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6d58d6d-6f1d-423a-ab96-cc34aa8c4d3f" containerName="registry-server" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.433979 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5416d4f-43bb-4ca5-a433-1e408bc69d26" containerName="registry-server" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.434071 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3ddba7f7-97fc-4e47-9172-ad52a73dc091" containerName="controller-manager" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.434731 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.434817 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56cb9c466-g7c95"] Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.435824 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.440688 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt"] Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.446675 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.446737 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.446832 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.446849 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.446895 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.446956 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.447026 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.447385 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.448179 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.448308 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.448379 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.450367 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.451310 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56cb9c466-g7c95"] Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.461692 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.509954 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfebbac0-ce4d-43c1-b872-293d64e8256b-serving-cert\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " 
pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.510406 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xzcz\" (UniqueName: \"kubernetes.io/projected/540bef84-9032-46b1-951a-9270e9cbbc9a-kube-api-access-9xzcz\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.510592 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfebbac0-ce4d-43c1-b872-293d64e8256b-client-ca\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.510751 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/540bef84-9032-46b1-951a-9270e9cbbc9a-config\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.510891 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfebbac0-ce4d-43c1-b872-293d64e8256b-proxy-ca-bundles\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.511036 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4vm\" (UniqueName: \"kubernetes.io/projected/cfebbac0-ce4d-43c1-b872-293d64e8256b-kube-api-access-gm4vm\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.511191 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/540bef84-9032-46b1-951a-9270e9cbbc9a-client-ca\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.511366 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfebbac0-ce4d-43c1-b872-293d64e8256b-config\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.511515 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/540bef84-9032-46b1-951a-9270e9cbbc9a-serving-cert\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.612656 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/540bef84-9032-46b1-951a-9270e9cbbc9a-client-ca\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: 
\"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.612723 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfebbac0-ce4d-43c1-b872-293d64e8256b-config\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.612744 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/540bef84-9032-46b1-951a-9270e9cbbc9a-serving-cert\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.612787 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfebbac0-ce4d-43c1-b872-293d64e8256b-serving-cert\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.612815 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xzcz\" (UniqueName: \"kubernetes.io/projected/540bef84-9032-46b1-951a-9270e9cbbc9a-kube-api-access-9xzcz\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.612871 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/cfebbac0-ce4d-43c1-b872-293d64e8256b-client-ca\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.612907 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/540bef84-9032-46b1-951a-9270e9cbbc9a-config\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.612923 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfebbac0-ce4d-43c1-b872-293d64e8256b-proxy-ca-bundles\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.612961 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4vm\" (UniqueName: \"kubernetes.io/projected/cfebbac0-ce4d-43c1-b872-293d64e8256b-kube-api-access-gm4vm\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.615036 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/540bef84-9032-46b1-951a-9270e9cbbc9a-client-ca\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: 
I0314 05:32:06.615090 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfebbac0-ce4d-43c1-b872-293d64e8256b-client-ca\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.616438 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfebbac0-ce4d-43c1-b872-293d64e8256b-config\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.616458 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/540bef84-9032-46b1-951a-9270e9cbbc9a-config\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.616473 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfebbac0-ce4d-43c1-b872-293d64e8256b-proxy-ca-bundles\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.619446 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfebbac0-ce4d-43c1-b872-293d64e8256b-serving-cert\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 
14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.619659 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/540bef84-9032-46b1-951a-9270e9cbbc9a-serving-cert\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.631956 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4vm\" (UniqueName: \"kubernetes.io/projected/cfebbac0-ce4d-43c1-b872-293d64e8256b-kube-api-access-gm4vm\") pod \"controller-manager-56cb9c466-g7c95\" (UID: \"cfebbac0-ce4d-43c1-b872-293d64e8256b\") " pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.648904 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xzcz\" (UniqueName: \"kubernetes.io/projected/540bef84-9032-46b1-951a-9270e9cbbc9a-kube-api-access-9xzcz\") pod \"route-controller-manager-6c9756c5df-htcbt\" (UID: \"540bef84-9032-46b1-951a-9270e9cbbc9a\") " pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.756346 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:06 crc kubenswrapper[4713]: I0314 05:32:06.763826 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.191376 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt"] Mar 14 05:32:07 crc kubenswrapper[4713]: W0314 05:32:07.199025 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod540bef84_9032_46b1_951a_9270e9cbbc9a.slice/crio-e3689ded959eda1aa65b761058ab30c5d21a72ed56e71eaecb5ef89e56270b45 WatchSource:0}: Error finding container e3689ded959eda1aa65b761058ab30c5d21a72ed56e71eaecb5ef89e56270b45: Status 404 returned error can't find the container with id e3689ded959eda1aa65b761058ab30c5d21a72ed56e71eaecb5ef89e56270b45 Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.248000 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56cb9c466-g7c95"] Mar 14 05:32:07 crc kubenswrapper[4713]: W0314 05:32:07.259370 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfebbac0_ce4d_43c1_b872_293d64e8256b.slice/crio-1aff2a1f369e5991e1707239502487d1f88390579a21dac9f0ef5cf84d5b9e18 WatchSource:0}: Error finding container 1aff2a1f369e5991e1707239502487d1f88390579a21dac9f0ef5cf84d5b9e18: Status 404 returned error can't find the container with id 1aff2a1f369e5991e1707239502487d1f88390579a21dac9f0ef5cf84d5b9e18 Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.573419 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ddba7f7-97fc-4e47-9172-ad52a73dc091" path="/var/lib/kubelet/pods/3ddba7f7-97fc-4e47-9172-ad52a73dc091/volumes" Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.573973 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="53964d3f-aab6-480e-93da-727e99502e1e" path="/var/lib/kubelet/pods/53964d3f-aab6-480e-93da-727e99502e1e/volumes" Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.741259 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" event={"ID":"cfebbac0-ce4d-43c1-b872-293d64e8256b","Type":"ContainerStarted","Data":"66bc3f03458a8f37e4fa4b11316c93aa2ef22850d6b0e1832e480de837fc941a"} Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.741300 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" event={"ID":"cfebbac0-ce4d-43c1-b872-293d64e8256b","Type":"ContainerStarted","Data":"1aff2a1f369e5991e1707239502487d1f88390579a21dac9f0ef5cf84d5b9e18"} Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.742279 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.749742 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" event={"ID":"540bef84-9032-46b1-951a-9270e9cbbc9a","Type":"ContainerStarted","Data":"01f3d0fa33a108eb57c269fe7ac1b047e66c75ed71d13bcce3f53e8a0e985294"} Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.749782 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" event={"ID":"540bef84-9032-46b1-951a-9270e9cbbc9a","Type":"ContainerStarted","Data":"e3689ded959eda1aa65b761058ab30c5d21a72ed56e71eaecb5ef89e56270b45"} Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.750227 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.751714 4713 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.768957 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" podStartSLOduration=3.768911981 podStartE2EDuration="3.768911981s" podCreationTimestamp="2026-03-14 05:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:32:07.764701069 +0000 UTC m=+310.852610369" watchObservedRunningTime="2026-03-14 05:32:07.768911981 +0000 UTC m=+310.856821281" Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.816980 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podStartSLOduration=3.81696629 podStartE2EDuration="3.81696629s" podCreationTimestamp="2026-03-14 05:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:32:07.814343477 +0000 UTC m=+310.902252787" watchObservedRunningTime="2026-03-14 05:32:07.81696629 +0000 UTC m=+310.904875590" Mar 14 05:32:07 crc kubenswrapper[4713]: I0314 05:32:07.990706 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.834914 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.835668 4713 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.835813 4713 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836003 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06" gracePeriod=15 Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836029 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae" gracePeriod=15 Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836061 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0" gracePeriod=15 Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836149 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2" gracePeriod=15 Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836121 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580" 
gracePeriod=15 Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836655 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.836820 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836834 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.836844 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836853 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.836862 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836869 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.836876 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836883 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.836893 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836901 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.836912 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836918 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.836925 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836931 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.836942 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836950 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.836961 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.836968 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837076 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837093 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837102 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837109 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837120 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837134 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837143 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837151 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 05:32:12 crc kubenswrapper[4713]: E0314 05:32:12.837287 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837299 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.837417 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.880147 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.892022 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.892086 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.892115 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.892151 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.892182 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.892232 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.892260 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.892287 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.993962 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994029 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994067 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994097 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994128 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994177 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994199 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994237 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994311 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994357 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994379 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994400 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994421 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994440 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994461 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:12 crc kubenswrapper[4713]: I0314 05:32:12.994480 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.178847 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:32:13 crc kubenswrapper[4713]: W0314 05:32:13.197694 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-eb969177e26ffa5462796179e472840cb9f16f353e7d942a362e9169a2a9f9ae WatchSource:0}: Error finding container eb969177e26ffa5462796179e472840cb9f16f353e7d942a362e9169a2a9f9ae: Status 404 returned error can't find the container with id eb969177e26ffa5462796179e472840cb9f16f353e7d942a362e9169a2a9f9ae Mar 14 05:32:13 crc kubenswrapper[4713]: E0314 05:32:13.201057 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c9e3bdb831195 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:32:13.200191893 +0000 UTC m=+316.288101193,LastTimestamp:2026-03-14 05:32:13.200191893 +0000 UTC m=+316.288101193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.783906 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.785638 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.786556 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae" exitCode=0 Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.786585 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580" exitCode=0 Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.786599 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0" exitCode=0 Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.786609 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2" exitCode=2 Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.786633 4713 scope.go:117] "RemoveContainer" containerID="27909fa1ec69ff0cb631a0c5e0c2003af722dc5dd2eb95c55f172e23eee590fd" Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.788257 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0"} Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.788345 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"eb969177e26ffa5462796179e472840cb9f16f353e7d942a362e9169a2a9f9ae"} Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.788852 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.790025 4713 generic.go:334] "Generic (PLEG): container finished" podID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" containerID="305f74f56f3e1bb03186454ecde1b36cf00445122e3feec7edc823e450f2b4d4" exitCode=0 Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.790059 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18","Type":"ContainerDied","Data":"305f74f56f3e1bb03186454ecde1b36cf00445122e3feec7edc823e450f2b4d4"} Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.790634 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:13 crc kubenswrapper[4713]: I0314 05:32:13.791167 4713 status_manager.go:851] "Failed to get status 
for pod" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:14 crc kubenswrapper[4713]: I0314 05:32:14.798714 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.200382 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.201992 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.202675 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.202897 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.203094 4713 status_manager.go:851] "Failed to get status for pod" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.203544 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.203973 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.204801 4713 status_manager.go:851] "Failed to get status for pod" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.205153 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322814 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kube-api-access\") pod \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322837 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kubelet-dir\") pod \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322850 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-var-lock\") pod \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\" (UID: \"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18\") " Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322898 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322899 4713 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322920 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322951 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322981 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-var-lock" (OuterVolumeSpecName: "var-lock") pod "8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" (UID: "8c5dc3a2-9f22-4658-9ff1-e3d30f635a18"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322995 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.322990 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" (UID: "8c5dc3a2-9f22-4658-9ff1-e3d30f635a18"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.323356 4713 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.323372 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.323382 4713 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.323392 4713 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.323402 4713 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.331422 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kube-api-access" 
(OuterVolumeSpecName: "kube-api-access") pod "8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" (UID: "8c5dc3a2-9f22-4658-9ff1-e3d30f635a18"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.424673 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c5dc3a2-9f22-4658-9ff1-e3d30f635a18-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.579505 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.812273 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.813302 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06" exitCode=0 Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.813380 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.813418 4713 scope.go:117] "RemoveContainer" containerID="cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.815611 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.816225 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8c5dc3a2-9f22-4658-9ff1-e3d30f635a18","Type":"ContainerDied","Data":"004d5dac22f6cde49d1bd11b3fd542a5672bc57473178531f191cd06f7d041d7"} Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.816259 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="004d5dac22f6cde49d1bd11b3fd542a5672bc57473178531f191cd06f7d041d7" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.816319 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.816537 4713 status_manager.go:851] "Failed to get status for pod" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.817066 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.817983 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.820853 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.821261 4713 status_manager.go:851] "Failed to get status for pod" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.822535 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.823089 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.823938 4713 status_manager.go:851] "Failed to get status for pod" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.834419 4713 scope.go:117] "RemoveContainer" containerID="f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.855885 4713 scope.go:117] "RemoveContainer" containerID="77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.872244 4713 scope.go:117] "RemoveContainer" containerID="eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.898904 4713 scope.go:117] "RemoveContainer" 
containerID="f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.923272 4713 scope.go:117] "RemoveContainer" containerID="7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.949274 4713 scope.go:117] "RemoveContainer" containerID="cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae" Mar 14 05:32:15 crc kubenswrapper[4713]: E0314 05:32:15.950818 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\": container with ID starting with cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae not found: ID does not exist" containerID="cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.950875 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae"} err="failed to get container status \"cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\": rpc error: code = NotFound desc = could not find container \"cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae\": container with ID starting with cb84eb3717c515774983135d28b52e56c96d5debbfbfdf30baf61ac3a3f2a1ae not found: ID does not exist" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.950912 4713 scope.go:117] "RemoveContainer" containerID="f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580" Mar 14 05:32:15 crc kubenswrapper[4713]: E0314 05:32:15.951282 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\": container with ID starting with 
f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580 not found: ID does not exist" containerID="f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.951318 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580"} err="failed to get container status \"f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\": rpc error: code = NotFound desc = could not find container \"f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580\": container with ID starting with f98c0c7db067994208a1482670d3d6679f6102f060b1d87af8bafdc354cfd580 not found: ID does not exist" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.951338 4713 scope.go:117] "RemoveContainer" containerID="77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0" Mar 14 05:32:15 crc kubenswrapper[4713]: E0314 05:32:15.951994 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\": container with ID starting with 77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0 not found: ID does not exist" containerID="77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.952059 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0"} err="failed to get container status \"77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\": rpc error: code = NotFound desc = could not find container \"77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0\": container with ID starting with 77a548a330152779195a13f00480aad463919a8c6f51e545ea79b22cd1d947c0 not found: ID does not 
exist" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.952103 4713 scope.go:117] "RemoveContainer" containerID="eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2" Mar 14 05:32:15 crc kubenswrapper[4713]: E0314 05:32:15.953069 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\": container with ID starting with eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2 not found: ID does not exist" containerID="eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.953094 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2"} err="failed to get container status \"eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\": rpc error: code = NotFound desc = could not find container \"eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2\": container with ID starting with eede12060e198243232517113c7d8d05aefd6ded7bb988ebaa56675d1216c2b2 not found: ID does not exist" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.953109 4713 scope.go:117] "RemoveContainer" containerID="f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06" Mar 14 05:32:15 crc kubenswrapper[4713]: E0314 05:32:15.953692 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\": container with ID starting with f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06 not found: ID does not exist" containerID="f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.953754 4713 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06"} err="failed to get container status \"f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\": rpc error: code = NotFound desc = could not find container \"f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06\": container with ID starting with f6fb9f5e9b3d0422917d0893f3f938b5cbdc1c33fb275e2b12544e6f5fa8bf06 not found: ID does not exist" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.953798 4713 scope.go:117] "RemoveContainer" containerID="7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20" Mar 14 05:32:15 crc kubenswrapper[4713]: E0314 05:32:15.954173 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\": container with ID starting with 7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20 not found: ID does not exist" containerID="7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20" Mar 14 05:32:15 crc kubenswrapper[4713]: I0314 05:32:15.954199 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20"} err="failed to get container status \"7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\": rpc error: code = NotFound desc = could not find container \"7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20\": container with ID starting with 7e47da67ebe049474da1942a9e907ddb3db97ffe666277f2d599fa9cd3770c20 not found: ID does not exist" Mar 14 05:32:17 crc kubenswrapper[4713]: I0314 05:32:17.569260 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:17 crc kubenswrapper[4713]: I0314 05:32:17.570335 4713 status_manager.go:851] "Failed to get status for pod" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:17 crc kubenswrapper[4713]: I0314 05:32:17.570913 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:17 crc kubenswrapper[4713]: E0314 05:32:17.616272 4713 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" volumeName="registry-storage" Mar 14 05:32:18 crc kubenswrapper[4713]: E0314 05:32:18.079438 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:18 crc kubenswrapper[4713]: E0314 05:32:18.079984 4713 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:18 crc kubenswrapper[4713]: E0314 05:32:18.080427 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:18 crc kubenswrapper[4713]: E0314 05:32:18.080795 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:18 crc kubenswrapper[4713]: E0314 05:32:18.081144 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:18 crc kubenswrapper[4713]: I0314 05:32:18.081198 4713 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 14 05:32:18 crc kubenswrapper[4713]: E0314 05:32:18.081573 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="200ms" Mar 14 05:32:18 crc kubenswrapper[4713]: E0314 05:32:18.282941 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="400ms" Mar 14 05:32:18 crc kubenswrapper[4713]: E0314 05:32:18.684154 4713 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="800ms" Mar 14 05:32:19 crc kubenswrapper[4713]: E0314 05:32:19.485538 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="1.6s" Mar 14 05:32:21 crc kubenswrapper[4713]: E0314 05:32:21.087478 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="3.2s" Mar 14 05:32:23 crc kubenswrapper[4713]: E0314 05:32:23.104319 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c9e3bdb831195 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 05:32:13.200191893 +0000 UTC m=+316.288101193,LastTimestamp:2026-03-14 05:32:13.200191893 +0000 UTC 
m=+316.288101193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 05:32:24 crc kubenswrapper[4713]: E0314 05:32:24.289390 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="6.4s" Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.563032 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.564299 4713 status_manager.go:851] "Failed to get status for pod" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.564975 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.583348 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.583384 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:24 crc kubenswrapper[4713]: E0314 05:32:24.584056 4713 mirror_client.go:138] "Failed 
deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.585163 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:24 crc kubenswrapper[4713]: W0314 05:32:24.620173 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-306bc9bc09526c51922241b0b0873da8e2b25839296e5c5dce595711ee344757 WatchSource:0}: Error finding container 306bc9bc09526c51922241b0b0873da8e2b25839296e5c5dce595711ee344757: Status 404 returned error can't find the container with id 306bc9bc09526c51922241b0b0873da8e2b25839296e5c5dce595711ee344757 Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.878869 4713 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2c38d515a4d1549a4b2d0e4c5f40f5d7049cef89e56464804bcb32161bc46d65" exitCode=0 Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.878964 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2c38d515a4d1549a4b2d0e4c5f40f5d7049cef89e56464804bcb32161bc46d65"} Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.879096 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"306bc9bc09526c51922241b0b0873da8e2b25839296e5c5dce595711ee344757"} Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.879608 4713 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.879661 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.879935 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:24 crc kubenswrapper[4713]: I0314 05:32:24.880162 4713 status_manager.go:851] "Failed to get status for pod" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 14 05:32:24 crc kubenswrapper[4713]: E0314 05:32:24.880151 4713 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.035243 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" podUID="7ca75b92-342d-46ef-8307-92efc0200a55" containerName="oauth-openshift" containerID="cri-o://5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7" gracePeriod=15 Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.520910 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.579779 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-serving-cert\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.579819 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-login\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.579849 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-audit-policies\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.579876 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ca75b92-342d-46ef-8307-92efc0200a55-audit-dir\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.579909 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-error\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 
05:32:25.579930 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-ocp-branding-template\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.579978 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-provider-selection\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.580004 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-router-certs\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.580019 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ca75b92-342d-46ef-8307-92efc0200a55-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.580245 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-service-ca\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.580271 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-session\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.580346 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-trusted-ca-bundle\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.580365 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44gjz\" (UniqueName: \"kubernetes.io/projected/7ca75b92-342d-46ef-8307-92efc0200a55-kube-api-access-44gjz\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.580382 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-cliconfig\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc 
kubenswrapper[4713]: I0314 05:32:25.580420 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-idp-0-file-data\") pod \"7ca75b92-342d-46ef-8307-92efc0200a55\" (UID: \"7ca75b92-342d-46ef-8307-92efc0200a55\") " Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.580978 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.581011 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.581591 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.581712 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.587554 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.588688 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.588985 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.589479 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca75b92-342d-46ef-8307-92efc0200a55-kube-api-access-44gjz" (OuterVolumeSpecName: "kube-api-access-44gjz") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "kube-api-access-44gjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.589580 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.589808 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.592593 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.592815 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.592897 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7ca75b92-342d-46ef-8307-92efc0200a55" (UID: "7ca75b92-342d-46ef-8307-92efc0200a55"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682235 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682281 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682299 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682311 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682324 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682335 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44gjz\" (UniqueName: \"kubernetes.io/projected/7ca75b92-342d-46ef-8307-92efc0200a55-kube-api-access-44gjz\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682347 4713 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682357 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682367 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682378 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682389 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7ca75b92-342d-46ef-8307-92efc0200a55-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682399 4713 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ca75b92-342d-46ef-8307-92efc0200a55-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682409 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.682420 4713 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7ca75b92-342d-46ef-8307-92efc0200a55-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.888567 4713 generic.go:334] "Generic (PLEG): container finished" podID="7ca75b92-342d-46ef-8307-92efc0200a55" containerID="5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7" exitCode=0 Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.888638 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" event={"ID":"7ca75b92-342d-46ef-8307-92efc0200a55","Type":"ContainerDied","Data":"5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7"} Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.888671 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" event={"ID":"7ca75b92-342d-46ef-8307-92efc0200a55","Type":"ContainerDied","Data":"69caafabd5ef72460d374a0f63ed84cd4b0313b00dc274f7a2181767fb4aa46a"} Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.888692 4713 scope.go:117] "RemoveContainer" containerID="5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.888798 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v9rqn" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.906018 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a0f29fcf595b4e6f6acf7a4d302b2e6c79604673254090c99df33d3e504273fe"} Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.906061 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"81f9e865c6c0fc4194a77d10324e84f99ae5cfc22937e6d35ad2fae1a72a9d18"} Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.906072 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c3025d77e6819a8bb199b4075f9955a8e2b1365a8f95780b6b65cd58867ee525"} Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.925825 4713 scope.go:117] "RemoveContainer" containerID="5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7" Mar 14 05:32:25 crc kubenswrapper[4713]: E0314 05:32:25.926463 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7\": container with ID starting with 5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7 not found: ID does not exist" containerID="5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7" Mar 14 05:32:25 crc kubenswrapper[4713]: I0314 05:32:25.926511 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7"} err="failed to get container status 
\"5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7\": rpc error: code = NotFound desc = could not find container \"5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7\": container with ID starting with 5285c0543bd6664907f15c97986e4531f13f40f4251c6d397bb1d9d84dee6ef7 not found: ID does not exist" Mar 14 05:32:26 crc kubenswrapper[4713]: I0314 05:32:26.914430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2dbea7078e101e20bfe76430e840372f552f68c3e2b55a901ae8dad0b06a97c5"} Mar 14 05:32:26 crc kubenswrapper[4713]: I0314 05:32:26.914475 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0c3851ed1befbebb9d76a7d05ae44821b3a86a5f7a4e5bf00bf6f2f0a41af9bd"} Mar 14 05:32:26 crc kubenswrapper[4713]: I0314 05:32:26.914578 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:26 crc kubenswrapper[4713]: I0314 05:32:26.914734 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:26 crc kubenswrapper[4713]: I0314 05:32:26.914759 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:27 crc kubenswrapper[4713]: I0314 05:32:27.925086 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 05:32:27 crc kubenswrapper[4713]: I0314 05:32:27.927123 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 05:32:27 crc kubenswrapper[4713]: I0314 05:32:27.927274 4713 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b2b180c1d9cb12d2671ac8458febd65d67666e67306dfc203cc8890e2cb5b15b" exitCode=1 Mar 14 05:32:27 crc kubenswrapper[4713]: I0314 05:32:27.927324 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b2b180c1d9cb12d2671ac8458febd65d67666e67306dfc203cc8890e2cb5b15b"} Mar 14 05:32:27 crc kubenswrapper[4713]: I0314 05:32:27.928062 4713 scope.go:117] "RemoveContainer" containerID="b2b180c1d9cb12d2671ac8458febd65d67666e67306dfc203cc8890e2cb5b15b" Mar 14 05:32:28 crc kubenswrapper[4713]: I0314 05:32:28.939446 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 05:32:28 crc kubenswrapper[4713]: I0314 05:32:28.942327 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 05:32:28 crc kubenswrapper[4713]: I0314 05:32:28.942453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"37dba197380ae2f1736b9692ed90cc69f0f419a93574b3897b13b3add1bee015"} Mar 14 05:32:29 crc kubenswrapper[4713]: I0314 05:32:29.586324 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:29 crc kubenswrapper[4713]: I0314 05:32:29.586381 4713 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:29 crc kubenswrapper[4713]: I0314 05:32:29.595238 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:29 crc kubenswrapper[4713]: I0314 05:32:29.673352 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:32:29 crc kubenswrapper[4713]: I0314 05:32:29.673455 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 14 05:32:29 crc kubenswrapper[4713]: I0314 05:32:29.673501 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 14 05:32:30 crc kubenswrapper[4713]: I0314 05:32:30.333301 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:32:31 crc kubenswrapper[4713]: I0314 05:32:31.924252 4713 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:31 crc kubenswrapper[4713]: I0314 05:32:31.958740 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:31 crc kubenswrapper[4713]: I0314 05:32:31.958791 4713 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:31 crc kubenswrapper[4713]: I0314 05:32:31.964513 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:31 crc kubenswrapper[4713]: I0314 05:32:31.967879 4713 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b82bcecf-65d5-48a7-a4c1-3d05329d8ea5" Mar 14 05:32:32 crc kubenswrapper[4713]: I0314 05:32:32.965555 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:32 crc kubenswrapper[4713]: I0314 05:32:32.966077 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1346874-95a4-4f6c-9655-b7122d808169" Mar 14 05:32:37 crc kubenswrapper[4713]: I0314 05:32:37.592280 4713 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b82bcecf-65d5-48a7-a4c1-3d05329d8ea5" Mar 14 05:32:38 crc kubenswrapper[4713]: I0314 05:32:38.107994 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 05:32:38 crc kubenswrapper[4713]: I0314 05:32:38.776790 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 05:32:39 crc kubenswrapper[4713]: I0314 05:32:39.505063 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 05:32:39 crc kubenswrapper[4713]: I0314 05:32:39.673778 4713 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 14 05:32:39 crc kubenswrapper[4713]: I0314 05:32:39.674258 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 14 05:32:39 crc kubenswrapper[4713]: I0314 05:32:39.741911 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 05:32:39 crc kubenswrapper[4713]: I0314 05:32:39.904870 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 14 05:32:41 crc kubenswrapper[4713]: I0314 05:32:41.095814 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 14 05:32:41 crc kubenswrapper[4713]: I0314 05:32:41.172746 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 14 05:32:41 crc kubenswrapper[4713]: I0314 05:32:41.409144 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 14 05:32:42 crc kubenswrapper[4713]: I0314 05:32:42.758524 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 14 05:32:42 crc kubenswrapper[4713]: I0314 05:32:42.863486 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 14 05:32:43 crc kubenswrapper[4713]: I0314 05:32:43.253929 4713 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 05:32:43 crc kubenswrapper[4713]: I0314 05:32:43.632945 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 14 05:32:43 crc kubenswrapper[4713]: I0314 05:32:43.642979 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 05:32:44 crc kubenswrapper[4713]: I0314 05:32:44.184485 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 14 05:32:44 crc kubenswrapper[4713]: I0314 05:32:44.649388 4713 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 05:32:44 crc kubenswrapper[4713]: I0314 05:32:44.651870 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=32.651847393 podStartE2EDuration="32.651847393s" podCreationTimestamp="2026-03-14 05:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:32:31.768850005 +0000 UTC m=+334.856759305" watchObservedRunningTime="2026-03-14 05:32:44.651847393 +0000 UTC m=+347.739756693" Mar 14 05:32:44 crc kubenswrapper[4713]: I0314 05:32:44.655879 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9rqn","openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 05:32:44 crc kubenswrapper[4713]: I0314 05:32:44.655991 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 05:32:44 crc kubenswrapper[4713]: I0314 05:32:44.664058 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 05:32:44 crc kubenswrapper[4713]: I0314 05:32:44.682329 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.682310023 podStartE2EDuration="13.682310023s" podCreationTimestamp="2026-03-14 05:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:32:44.681435014 +0000 UTC m=+347.769344374" watchObservedRunningTime="2026-03-14 05:32:44.682310023 +0000 UTC m=+347.770219323" Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.174153 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.245636 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.431812 4713 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.546448 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.571179 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca75b92-342d-46ef-8307-92efc0200a55" path="/var/lib/kubelet/pods/7ca75b92-342d-46ef-8307-92efc0200a55/volumes" Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.726685 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.743072 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.757872 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.817676 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 05:32:45 crc kubenswrapper[4713]: I0314 05:32:45.864975 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.041371 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.097083 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.128933 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.281957 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.282463 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.289002 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.312623 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 
05:32:46.348629 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.492700 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.621225 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.705146 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.726735 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.755419 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.755486 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.759582 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.821746 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.838687 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.856378 4713 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 05:32:46 crc kubenswrapper[4713]: I0314 05:32:46.894195 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.038617 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.108257 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.111364 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.161472 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.188765 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.237247 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.268998 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.289928 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.392522 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.519515 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.741755 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 05:32:47 crc kubenswrapper[4713]: I0314 05:32:47.774710 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.021479 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.080442 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.135255 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.199830 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.258393 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.345982 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.431382 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 
05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.497713 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.510569 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.515427 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.533240 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.581107 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.630584 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.688666 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.925764 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 05:32:48 crc kubenswrapper[4713]: I0314 05:32:48.997461 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.027451 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.028665 4713 reflector.go:368] Caches populated for *v1.CSIDriver from 
k8s.io/client-go/informers/factory.go:160 Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.032900 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.055829 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.070509 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.112623 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.133357 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.138063 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.228276 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.230425 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.241457 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.467170 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.674430 4713 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.674502 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.674558 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.675185 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"37dba197380ae2f1736b9692ed90cc69f0f419a93574b3897b13b3add1bee015"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.675310 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://37dba197380ae2f1736b9692ed90cc69f0f419a93574b3897b13b3add1bee015" gracePeriod=30 Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.869063 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.889064 4713 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 05:32:49 crc kubenswrapper[4713]: I0314 05:32:49.942669 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.044383 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.055703 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.137347 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.159576 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.192281 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.270158 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.301356 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.347507 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.360230 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 14 05:32:50 crc 
kubenswrapper[4713]: I0314 05:32:50.367152 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.413830 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.436021 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.554187 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.663293 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.739718 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.741333 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.860725 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.908152 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.958024 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 14 05:32:50 crc kubenswrapper[4713]: I0314 05:32:50.989138 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.143308 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.154239 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.180260 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.356949 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.369307 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.383901 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.405525 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.462364 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.578342 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.595354 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.618441 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.660867 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.685890 4713 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.691881 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.824280 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 14 05:32:51 crc kubenswrapper[4713]: I0314 05:32:51.953860 4713 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.093436 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.190118 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.195080 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.242175 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.333305 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.373630 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.413558 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.475815 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.567026 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.675042 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.711624 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.770121 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.829613 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.904239 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 14 05:32:52 crc kubenswrapper[4713]: I0314 05:32:52.968098 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.038921 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.061394 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.136134 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.188223 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.195273 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.395505 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.409992 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.605384 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.608868 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.610375 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.629753 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.714381 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7dc7444945-bll47"]
Mar 14 05:32:53 crc kubenswrapper[4713]: E0314 05:32:53.714608 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca75b92-342d-46ef-8307-92efc0200a55" containerName="oauth-openshift"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.714624 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca75b92-342d-46ef-8307-92efc0200a55" containerName="oauth-openshift"
Mar 14 05:32:53 crc kubenswrapper[4713]: E0314 05:32:53.714643 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" containerName="installer"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.714651 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" containerName="installer"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.714756 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca75b92-342d-46ef-8307-92efc0200a55" containerName="oauth-openshift"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.714774 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5dc3a2-9f22-4658-9ff1-e3d30f635a18" containerName="installer"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.715230 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.716954 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.717267 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.718941 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.719036 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.719275 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.719298 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.719312 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.719309 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.724396 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.724419 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.724562 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.725070 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.731812 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.733233 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.737783 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7dc7444945-bll47"]
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.740086 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.798541 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.836764 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.838836 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.838869 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-template-error\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.838891 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69f142af-62c3-4d29-8870-be92b4c7216d-audit-dir\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.838912 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-session\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.838931 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.838948 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b79lz\" (UniqueName: \"kubernetes.io/projected/69f142af-62c3-4d29-8870-be92b4c7216d-kube-api-access-b79lz\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.838968 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.838988 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-audit-policies\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.839028 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.839047 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-template-login\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.839070 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.839083 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.839101 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.839117 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.849771 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940695 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-template-login\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940765 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940797 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940821 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940850 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940899 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-template-error\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940924 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69f142af-62c3-4d29-8870-be92b4c7216d-audit-dir\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940948 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-session\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940969 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.940986 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b79lz\" (UniqueName: \"kubernetes.io/projected/69f142af-62c3-4d29-8870-be92b4c7216d-kube-api-access-b79lz\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.941006 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.941026 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-audit-policies\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.941067 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.941437 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69f142af-62c3-4d29-8870-be92b4c7216d-audit-dir\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.942374 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-audit-policies\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.942391 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.942634 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.943110 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.946893 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.946932 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-session\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.947306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.947614 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.948063 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-template-error\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.948989 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.949335 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.949948 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/69f142af-62c3-4d29-8870-be92b4c7216d-v4-0-config-user-template-login\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.961038 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b79lz\" (UniqueName: \"kubernetes.io/projected/69f142af-62c3-4d29-8870-be92b4c7216d-kube-api-access-b79lz\") pod \"oauth-openshift-7dc7444945-bll47\" (UID: \"69f142af-62c3-4d29-8870-be92b4c7216d\") " pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:53 crc kubenswrapper[4713]: I0314 05:32:53.971446 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.017354 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.041712 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.137987 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.176061 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.179014 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.194136 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.224425 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.266429 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.283720 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.310541 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.380583 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.401117 4713 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.401375 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0" gracePeriod=5
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.417304 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.563637 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.618580 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.636656 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.691361 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.734435 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.798279 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.809641 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.861028 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.879429 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.950039 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.953512 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 14 05:32:54 crc kubenswrapper[4713]: I0314 05:32:54.992482 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.019320 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.161466 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.184897 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.187944 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.220153 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.295454 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.327138 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.456986 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.466848 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.527088 4713 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.527928 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.552269 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 14 05:32:55 crc
kubenswrapper[4713]: I0314 05:32:55.634982 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.636906 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.695433 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.720865 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.834849 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.853778 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.908807 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.943528 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.966161 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 14 05:32:55 crc kubenswrapper[4713]: I0314 05:32:55.984681 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.101131 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.161082 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.336147 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.462000 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.498298 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.536612 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.639692 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.640563 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.680961 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.783473 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.817522 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.855485 4713 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.907226 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.956049 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 14 05:32:56 crc kubenswrapper[4713]: I0314 05:32:56.965320 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 14 05:32:57 crc kubenswrapper[4713]: I0314 05:32:57.124564 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 14 05:32:57 crc kubenswrapper[4713]: E0314 05:32:57.153473 4713 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 14 05:32:57 crc kubenswrapper[4713]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc7444945-bll47_openshift-authentication_69f142af-62c3-4d29-8870-be92b4c7216d_0(3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad): error adding pod openshift-authentication_oauth-openshift-7dc7444945-bll47 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad" Netns:"/var/run/netns/c306b5ca-38ed-4e9d-8e21-c5404bc1b52d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc7444945-bll47;K8S_POD_INFRA_CONTAINER_ID=3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad;K8S_POD_UID=69f142af-62c3-4d29-8870-be92b4c7216d" Path:"" ERRORED: error configuring pod 
[openshift-authentication/oauth-openshift-7dc7444945-bll47] networking: Multus: [openshift-authentication/oauth-openshift-7dc7444945-bll47/69f142af-62c3-4d29-8870-be92b4c7216d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc7444945-bll47 in out of cluster comm: pod "oauth-openshift-7dc7444945-bll47" not found Mar 14 05:32:57 crc kubenswrapper[4713]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 14 05:32:57 crc kubenswrapper[4713]: > Mar 14 05:32:57 crc kubenswrapper[4713]: E0314 05:32:57.153541 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 14 05:32:57 crc kubenswrapper[4713]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc7444945-bll47_openshift-authentication_69f142af-62c3-4d29-8870-be92b4c7216d_0(3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad): error adding pod openshift-authentication_oauth-openshift-7dc7444945-bll47 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad" Netns:"/var/run/netns/c306b5ca-38ed-4e9d-8e21-c5404bc1b52d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc7444945-bll47;K8S_POD_INFRA_CONTAINER_ID=3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad;K8S_POD_UID=69f142af-62c3-4d29-8870-be92b4c7216d" Path:"" ERRORED: error configuring pod 
[openshift-authentication/oauth-openshift-7dc7444945-bll47] networking: Multus: [openshift-authentication/oauth-openshift-7dc7444945-bll47/69f142af-62c3-4d29-8870-be92b4c7216d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc7444945-bll47 in out of cluster comm: pod "oauth-openshift-7dc7444945-bll47" not found Mar 14 05:32:57 crc kubenswrapper[4713]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 14 05:32:57 crc kubenswrapper[4713]: > pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" Mar 14 05:32:57 crc kubenswrapper[4713]: E0314 05:32:57.153564 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 14 05:32:57 crc kubenswrapper[4713]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc7444945-bll47_openshift-authentication_69f142af-62c3-4d29-8870-be92b4c7216d_0(3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad): error adding pod openshift-authentication_oauth-openshift-7dc7444945-bll47 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad" Netns:"/var/run/netns/c306b5ca-38ed-4e9d-8e21-c5404bc1b52d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc7444945-bll47;K8S_POD_INFRA_CONTAINER_ID=3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad;K8S_POD_UID=69f142af-62c3-4d29-8870-be92b4c7216d" Path:"" ERRORED: 
error configuring pod [openshift-authentication/oauth-openshift-7dc7444945-bll47] networking: Multus: [openshift-authentication/oauth-openshift-7dc7444945-bll47/69f142af-62c3-4d29-8870-be92b4c7216d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc7444945-bll47 in out of cluster comm: pod "oauth-openshift-7dc7444945-bll47" not found Mar 14 05:32:57 crc kubenswrapper[4713]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 14 05:32:57 crc kubenswrapper[4713]: > pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" Mar 14 05:32:57 crc kubenswrapper[4713]: E0314 05:32:57.153612 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-7dc7444945-bll47_openshift-authentication(69f142af-62c3-4d29-8870-be92b4c7216d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-7dc7444945-bll47_openshift-authentication(69f142af-62c3-4d29-8870-be92b4c7216d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc7444945-bll47_openshift-authentication_69f142af-62c3-4d29-8870-be92b4c7216d_0(3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad): error adding pod openshift-authentication_oauth-openshift-7dc7444945-bll47 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad\\\" 
Netns:\\\"/var/run/netns/c306b5ca-38ed-4e9d-8e21-c5404bc1b52d\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc7444945-bll47;K8S_POD_INFRA_CONTAINER_ID=3c46165abde3beb218ea93de9bb75c0bd81d7578b1a8beb5f771bb8e206814ad;K8S_POD_UID=69f142af-62c3-4d29-8870-be92b4c7216d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7dc7444945-bll47] networking: Multus: [openshift-authentication/oauth-openshift-7dc7444945-bll47/69f142af-62c3-4d29-8870-be92b4c7216d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc7444945-bll47 in out of cluster comm: pod \\\"oauth-openshift-7dc7444945-bll47\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" Mar 14 05:32:57 crc kubenswrapper[4713]: I0314 05:32:57.173195 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 05:32:57 crc kubenswrapper[4713]: I0314 05:32:57.177771 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 14 05:32:57 crc kubenswrapper[4713]: I0314 05:32:57.250152 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 14 05:32:57 crc kubenswrapper[4713]: I0314 
05:32:57.324766 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 14 05:32:57 crc kubenswrapper[4713]: I0314 05:32:57.493311 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 14 05:32:57 crc kubenswrapper[4713]: I0314 05:32:57.579936 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 05:32:57 crc kubenswrapper[4713]: I0314 05:32:57.998115 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.081596 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.111048 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.119036 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.119933 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.274489 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.354223 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.483675 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.650440 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.711306 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 14 05:32:58 crc kubenswrapper[4713]: I0314 05:32:58.711879 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 14 05:32:59 crc kubenswrapper[4713]: I0314 05:32:59.164445 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 05:32:59 crc kubenswrapper[4713]: I0314 05:32:59.212403 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 14 05:32:59 crc kubenswrapper[4713]: I0314 05:32:59.425592 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 05:32:59 crc kubenswrapper[4713]: I0314 05:32:59.502939 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 14 05:32:59 crc kubenswrapper[4713]: 
I0314 05:32:59.890670 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 14 05:32:59 crc kubenswrapper[4713]: I0314 05:32:59.999146 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 05:32:59 crc kubenswrapper[4713]: I0314 05:32:59.999290 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.001644 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.061661 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.121499 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.121605 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.121645 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 
05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.121699 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.121768 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.121991 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.122077 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.122138 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.122178 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.131483 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.133283 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.133341 4713 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0" exitCode=137 Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.133383 4713 scope.go:117] "RemoveContainer" containerID="75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.133454 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.195039 4713 scope.go:117] "RemoveContainer" containerID="75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0" Mar 14 05:33:00 crc kubenswrapper[4713]: E0314 05:33:00.195465 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0\": container with ID starting with 75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0 not found: ID does not exist" containerID="75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.195514 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0"} err="failed to get container status \"75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0\": rpc error: code = NotFound desc = could not find container \"75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0\": container with ID starting with 75729c671e9ecf93bdf0e256f0dffbc5bb1161e94cb2277874e1b93a6cc4bfb0 not found: ID does not exist" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.201844 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.223436 4713 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.223471 4713 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on 
node \"crc\" DevicePath \"\"" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.223484 4713 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.223496 4713 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.223507 4713 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.639340 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 14 05:33:00 crc kubenswrapper[4713]: I0314 05:33:00.880361 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7dc7444945-bll47"] Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.141219 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" event={"ID":"69f142af-62c3-4d29-8870-be92b4c7216d","Type":"ContainerStarted","Data":"ef5af81019e2cc410dddb8817db8a366260075d08aa35fcae4eb5912ea7b56a4"} Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.141267 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" event={"ID":"69f142af-62c3-4d29-8870-be92b4c7216d","Type":"ContainerStarted","Data":"b2c01caece37125f1830fb1f96fbd2fb1f20945039dc29ecad9dd6d7105676b7"} Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.141610 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.142819 4713 patch_prober.go:28] interesting pod/oauth-openshift-7dc7444945-bll47 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": dial tcp 10.217.0.69:6443: connect: connection refused" start-of-body= Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.142859 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": dial tcp 10.217.0.69:6443: connect: connection refused" Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.170497 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podStartSLOduration=62.170453273 podStartE2EDuration="1m2.170453273s" podCreationTimestamp="2026-03-14 05:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:33:01.168854942 +0000 UTC m=+364.256764242" watchObservedRunningTime="2026-03-14 05:33:01.170453273 +0000 UTC m=+364.258362613" Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.580792 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.581848 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.597067 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.597109 4713 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8dc942a3-6a63-44f2-9156-45bca97a309c" Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.603520 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 05:33:01 crc kubenswrapper[4713]: I0314 05:33:01.603577 4713 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8dc942a3-6a63-44f2-9156-45bca97a309c" Mar 14 05:33:02 crc kubenswrapper[4713]: I0314 05:33:02.157969 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" Mar 14 05:33:18 crc kubenswrapper[4713]: I0314 05:33:18.268516 4713 generic.go:334] "Generic (PLEG): container finished" podID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerID="41f262140bd1b3146b013df15eab18ec3370e0bdcd4a75645ef2600b9cf271cd" exitCode=0 Mar 14 05:33:18 crc kubenswrapper[4713]: I0314 05:33:18.268593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" event={"ID":"068aebba-22ff-46cd-856c-e85d409e0ae5","Type":"ContainerDied","Data":"41f262140bd1b3146b013df15eab18ec3370e0bdcd4a75645ef2600b9cf271cd"} Mar 14 05:33:18 crc kubenswrapper[4713]: I0314 05:33:18.270028 4713 scope.go:117] "RemoveContainer" containerID="41f262140bd1b3146b013df15eab18ec3370e0bdcd4a75645ef2600b9cf271cd" Mar 14 05:33:19 crc kubenswrapper[4713]: I0314 05:33:19.279741 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" 
event={"ID":"068aebba-22ff-46cd-856c-e85d409e0ae5","Type":"ContainerStarted","Data":"34c9aa496b2123d399e7beb414b156db14ab9112aadfb1e8ecb695949d3fd4fe"} Mar 14 05:33:19 crc kubenswrapper[4713]: I0314 05:33:19.280222 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:33:19 crc kubenswrapper[4713]: I0314 05:33:19.281144 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:33:20 crc kubenswrapper[4713]: I0314 05:33:20.291099 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 14 05:33:20 crc kubenswrapper[4713]: I0314 05:33:20.292148 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 05:33:20 crc kubenswrapper[4713]: I0314 05:33:20.294417 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 05:33:20 crc kubenswrapper[4713]: I0314 05:33:20.294477 4713 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="37dba197380ae2f1736b9692ed90cc69f0f419a93574b3897b13b3add1bee015" exitCode=137 Mar 14 05:33:20 crc kubenswrapper[4713]: I0314 05:33:20.294584 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"37dba197380ae2f1736b9692ed90cc69f0f419a93574b3897b13b3add1bee015"} Mar 14 05:33:20 crc kubenswrapper[4713]: I0314 05:33:20.294667 4713 scope.go:117] "RemoveContainer" 
containerID="b2b180c1d9cb12d2671ac8458febd65d67666e67306dfc203cc8890e2cb5b15b" Mar 14 05:33:21 crc kubenswrapper[4713]: I0314 05:33:21.301123 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 14 05:33:21 crc kubenswrapper[4713]: I0314 05:33:21.302166 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 05:33:21 crc kubenswrapper[4713]: I0314 05:33:21.302760 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6e121548ea30a778be4c565982788083b8c1437b509fa6234b67bdc60d82cd8c"} Mar 14 05:33:29 crc kubenswrapper[4713]: I0314 05:33:29.673539 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:33:29 crc kubenswrapper[4713]: I0314 05:33:29.679082 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:33:30 crc kubenswrapper[4713]: I0314 05:33:30.333430 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:33:30 crc kubenswrapper[4713]: I0314 05:33:30.338121 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.177072 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557774-hl9gk"] Mar 14 05:34:00 crc kubenswrapper[4713]: E0314 05:34:00.177802 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.177817 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.177936 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.178382 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557774-hl9gk" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.180599 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.181342 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.181502 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.190894 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557774-hl9gk"] Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.239920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr24t\" (UniqueName: \"kubernetes.io/projected/ce66bc3f-1fe6-495b-8d7c-7caaea776d81-kube-api-access-kr24t\") pod \"auto-csr-approver-29557774-hl9gk\" (UID: \"ce66bc3f-1fe6-495b-8d7c-7caaea776d81\") " pod="openshift-infra/auto-csr-approver-29557774-hl9gk" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.340933 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kr24t\" (UniqueName: \"kubernetes.io/projected/ce66bc3f-1fe6-495b-8d7c-7caaea776d81-kube-api-access-kr24t\") pod \"auto-csr-approver-29557774-hl9gk\" (UID: \"ce66bc3f-1fe6-495b-8d7c-7caaea776d81\") " pod="openshift-infra/auto-csr-approver-29557774-hl9gk" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.359933 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr24t\" (UniqueName: \"kubernetes.io/projected/ce66bc3f-1fe6-495b-8d7c-7caaea776d81-kube-api-access-kr24t\") pod \"auto-csr-approver-29557774-hl9gk\" (UID: \"ce66bc3f-1fe6-495b-8d7c-7caaea776d81\") " pod="openshift-infra/auto-csr-approver-29557774-hl9gk" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.493563 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557774-hl9gk" Mar 14 05:34:00 crc kubenswrapper[4713]: I0314 05:34:00.734878 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557774-hl9gk"] Mar 14 05:34:01 crc kubenswrapper[4713]: I0314 05:34:01.533611 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557774-hl9gk" event={"ID":"ce66bc3f-1fe6-495b-8d7c-7caaea776d81","Type":"ContainerStarted","Data":"a7c3935f7d50807bad1d2dea4544d262f4cefe962b1f794464b88c0861516534"} Mar 14 05:34:02 crc kubenswrapper[4713]: I0314 05:34:02.538843 4713 generic.go:334] "Generic (PLEG): container finished" podID="ce66bc3f-1fe6-495b-8d7c-7caaea776d81" containerID="27e961efd45e801c359322a0df11b6c7d2d21836a53017b4a09d8777d9b87c5b" exitCode=0 Mar 14 05:34:02 crc kubenswrapper[4713]: I0314 05:34:02.538907 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557774-hl9gk" event={"ID":"ce66bc3f-1fe6-495b-8d7c-7caaea776d81","Type":"ContainerDied","Data":"27e961efd45e801c359322a0df11b6c7d2d21836a53017b4a09d8777d9b87c5b"} Mar 14 05:34:03 crc kubenswrapper[4713]: I0314 
05:34:03.796489 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557774-hl9gk" Mar 14 05:34:03 crc kubenswrapper[4713]: I0314 05:34:03.886625 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr24t\" (UniqueName: \"kubernetes.io/projected/ce66bc3f-1fe6-495b-8d7c-7caaea776d81-kube-api-access-kr24t\") pod \"ce66bc3f-1fe6-495b-8d7c-7caaea776d81\" (UID: \"ce66bc3f-1fe6-495b-8d7c-7caaea776d81\") " Mar 14 05:34:03 crc kubenswrapper[4713]: I0314 05:34:03.892073 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce66bc3f-1fe6-495b-8d7c-7caaea776d81-kube-api-access-kr24t" (OuterVolumeSpecName: "kube-api-access-kr24t") pod "ce66bc3f-1fe6-495b-8d7c-7caaea776d81" (UID: "ce66bc3f-1fe6-495b-8d7c-7caaea776d81"). InnerVolumeSpecName "kube-api-access-kr24t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:34:03 crc kubenswrapper[4713]: I0314 05:34:03.988197 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr24t\" (UniqueName: \"kubernetes.io/projected/ce66bc3f-1fe6-495b-8d7c-7caaea776d81-kube-api-access-kr24t\") on node \"crc\" DevicePath \"\"" Mar 14 05:34:04 crc kubenswrapper[4713]: I0314 05:34:04.550437 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557774-hl9gk" event={"ID":"ce66bc3f-1fe6-495b-8d7c-7caaea776d81","Type":"ContainerDied","Data":"a7c3935f7d50807bad1d2dea4544d262f4cefe962b1f794464b88c0861516534"} Mar 14 05:34:04 crc kubenswrapper[4713]: I0314 05:34:04.550752 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7c3935f7d50807bad1d2dea4544d262f4cefe962b1f794464b88c0861516534" Mar 14 05:34:04 crc kubenswrapper[4713]: I0314 05:34:04.550802 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557774-hl9gk" Mar 14 05:34:10 crc kubenswrapper[4713]: I0314 05:34:10.731629 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:34:10 crc kubenswrapper[4713]: I0314 05:34:10.734638 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.384666 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-42m74"] Mar 14 05:34:34 crc kubenswrapper[4713]: E0314 05:34:34.385351 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce66bc3f-1fe6-495b-8d7c-7caaea776d81" containerName="oc" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.385363 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce66bc3f-1fe6-495b-8d7c-7caaea776d81" containerName="oc" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.385459 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce66bc3f-1fe6-495b-8d7c-7caaea776d81" containerName="oc" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.385800 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.398083 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-42m74"] Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.496563 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48af07f5-ccc8-4396-8c9a-f9b683001cb2-registry-tls\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.496640 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48af07f5-ccc8-4396-8c9a-f9b683001cb2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.496718 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48af07f5-ccc8-4396-8c9a-f9b683001cb2-registry-certificates\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.496747 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.496775 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fvj7\" (UniqueName: \"kubernetes.io/projected/48af07f5-ccc8-4396-8c9a-f9b683001cb2-kube-api-access-5fvj7\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.496792 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48af07f5-ccc8-4396-8c9a-f9b683001cb2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.496808 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48af07f5-ccc8-4396-8c9a-f9b683001cb2-trusted-ca\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.496824 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48af07f5-ccc8-4396-8c9a-f9b683001cb2-bound-sa-token\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.685621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/48af07f5-ccc8-4396-8c9a-f9b683001cb2-registry-tls\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.685743 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48af07f5-ccc8-4396-8c9a-f9b683001cb2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.685827 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48af07f5-ccc8-4396-8c9a-f9b683001cb2-registry-certificates\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.685896 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fvj7\" (UniqueName: \"kubernetes.io/projected/48af07f5-ccc8-4396-8c9a-f9b683001cb2-kube-api-access-5fvj7\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.685925 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48af07f5-ccc8-4396-8c9a-f9b683001cb2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.685949 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48af07f5-ccc8-4396-8c9a-f9b683001cb2-trusted-ca\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.685971 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48af07f5-ccc8-4396-8c9a-f9b683001cb2-bound-sa-token\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.687043 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48af07f5-ccc8-4396-8c9a-f9b683001cb2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.687530 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48af07f5-ccc8-4396-8c9a-f9b683001cb2-registry-certificates\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.688413 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48af07f5-ccc8-4396-8c9a-f9b683001cb2-trusted-ca\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc 
kubenswrapper[4713]: I0314 05:34:34.696887 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48af07f5-ccc8-4396-8c9a-f9b683001cb2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.710827 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48af07f5-ccc8-4396-8c9a-f9b683001cb2-bound-sa-token\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.712198 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48af07f5-ccc8-4396-8c9a-f9b683001cb2-registry-tls\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.713524 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fvj7\" (UniqueName: \"kubernetes.io/projected/48af07f5-ccc8-4396-8c9a-f9b683001cb2-kube-api-access-5fvj7\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:34 crc kubenswrapper[4713]: I0314 05:34:34.725696 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-42m74\" (UID: \"48af07f5-ccc8-4396-8c9a-f9b683001cb2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:35 crc kubenswrapper[4713]: I0314 05:34:35.003969 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:35 crc kubenswrapper[4713]: I0314 05:34:35.453762 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-42m74"] Mar 14 05:34:35 crc kubenswrapper[4713]: I0314 05:34:35.745149 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-42m74" event={"ID":"48af07f5-ccc8-4396-8c9a-f9b683001cb2","Type":"ContainerStarted","Data":"22acb0e7aa387a5948acff23ea5d236ef27b22b75e8c26e2964e400e9f3c6d2f"} Mar 14 05:34:35 crc kubenswrapper[4713]: I0314 05:34:35.745262 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-42m74" event={"ID":"48af07f5-ccc8-4396-8c9a-f9b683001cb2","Type":"ContainerStarted","Data":"83684b7b19ecd16603a690f642ecdbb5841c9a31cd60d57760504cc0bc4c2f5f"} Mar 14 05:34:35 crc kubenswrapper[4713]: I0314 05:34:35.745704 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-42m74" Mar 14 05:34:35 crc kubenswrapper[4713]: I0314 05:34:35.775565 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-42m74" podStartSLOduration=1.775534113 podStartE2EDuration="1.775534113s" podCreationTimestamp="2026-03-14 05:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:34:35.77226371 +0000 UTC m=+458.860173010" watchObservedRunningTime="2026-03-14 05:34:35.775534113 +0000 UTC m=+458.863443413" Mar 14 05:34:40 crc kubenswrapper[4713]: I0314 05:34:40.731279 4713 patch_prober.go:28] interesting 
pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:34:40 crc kubenswrapper[4713]: I0314 05:34:40.731987 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.051247 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbxl4"] Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.053256 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zbxl4" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerName="registry-server" containerID="cri-o://dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be" gracePeriod=30 Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.068387 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jncw4"] Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.069037 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jncw4" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerName="registry-server" containerID="cri-o://88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed" gracePeriod=30 Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.075684 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mw4tj"] Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.075951 4713 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerName="marketplace-operator" containerID="cri-o://34c9aa496b2123d399e7beb414b156db14ab9112aadfb1e8ecb695949d3fd4fe" gracePeriod=30 Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.082719 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zt2"] Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.083010 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b9zt2" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerName="registry-server" containerID="cri-o://dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b" gracePeriod=30 Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.093666 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvtpw"] Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.094398 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dvtpw" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" containerName="registry-server" containerID="cri-o://73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427" gracePeriod=30 Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.098001 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bzczw"] Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.098690 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.122055 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bzczw"]
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.268338 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c50b2f7-7be4-4125-94ac-525d908a9e86-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bzczw\" (UID: \"2c50b2f7-7be4-4125-94ac-525d908a9e86\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.268858 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c50b2f7-7be4-4125-94ac-525d908a9e86-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bzczw\" (UID: \"2c50b2f7-7be4-4125-94ac-525d908a9e86\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.268901 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svfs4\" (UniqueName: \"kubernetes.io/projected/2c50b2f7-7be4-4125-94ac-525d908a9e86-kube-api-access-svfs4\") pod \"marketplace-operator-79b997595-bzczw\" (UID: \"2c50b2f7-7be4-4125-94ac-525d908a9e86\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.369722 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c50b2f7-7be4-4125-94ac-525d908a9e86-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bzczw\" (UID: \"2c50b2f7-7be4-4125-94ac-525d908a9e86\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.369771 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svfs4\" (UniqueName: \"kubernetes.io/projected/2c50b2f7-7be4-4125-94ac-525d908a9e86-kube-api-access-svfs4\") pod \"marketplace-operator-79b997595-bzczw\" (UID: \"2c50b2f7-7be4-4125-94ac-525d908a9e86\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.369825 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c50b2f7-7be4-4125-94ac-525d908a9e86-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bzczw\" (UID: \"2c50b2f7-7be4-4125-94ac-525d908a9e86\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.371874 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c50b2f7-7be4-4125-94ac-525d908a9e86-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bzczw\" (UID: \"2c50b2f7-7be4-4125-94ac-525d908a9e86\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.376252 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c50b2f7-7be4-4125-94ac-525d908a9e86-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bzczw\" (UID: \"2c50b2f7-7be4-4125-94ac-525d908a9e86\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.390135 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svfs4\" (UniqueName: \"kubernetes.io/projected/2c50b2f7-7be4-4125-94ac-525d908a9e86-kube-api-access-svfs4\") pod \"marketplace-operator-79b997595-bzczw\" (UID: \"2c50b2f7-7be4-4125-94ac-525d908a9e86\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.518381 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bzczw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.527738 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbxl4"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.531964 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jncw4"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.536110 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9zt2"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.544520 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvtpw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.672789 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-utilities\") pod \"99ba127f-5518-4d4e-9581-10970dcb998c\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.672830 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fpgn\" (UniqueName: \"kubernetes.io/projected/c49b0182-cb22-4f55-b7a8-893646fa21fe-kube-api-access-2fpgn\") pod \"c49b0182-cb22-4f55-b7a8-893646fa21fe\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.672858 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gvzz\" (UniqueName: \"kubernetes.io/projected/99ba127f-5518-4d4e-9581-10970dcb998c-kube-api-access-9gvzz\") pod \"99ba127f-5518-4d4e-9581-10970dcb998c\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.672882 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-utilities\") pod \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.672926 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-catalog-content\") pod \"c49b0182-cb22-4f55-b7a8-893646fa21fe\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.672951 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-utilities\") pod \"c49b0182-cb22-4f55-b7a8-893646fa21fe\" (UID: \"c49b0182-cb22-4f55-b7a8-893646fa21fe\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.672965 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-utilities\") pod \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.672983 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wmxt\" (UniqueName: \"kubernetes.io/projected/2bdf5393-1e5e-4965-a24c-b45a22c6053e-kube-api-access-8wmxt\") pod \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.673007 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-catalog-content\") pod \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.673029 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-catalog-content\") pod \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\" (UID: \"2bdf5393-1e5e-4965-a24c-b45a22c6053e\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.673054 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p4j2\" (UniqueName: \"kubernetes.io/projected/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-kube-api-access-7p4j2\") pod \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\" (UID: \"caec2fda-bb55-4f4f-a487-8deeb4bf5da4\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.673080 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-catalog-content\") pod \"99ba127f-5518-4d4e-9581-10970dcb998c\" (UID: \"99ba127f-5518-4d4e-9581-10970dcb998c\") "
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.673597 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-utilities" (OuterVolumeSpecName: "utilities") pod "99ba127f-5518-4d4e-9581-10970dcb998c" (UID: "99ba127f-5518-4d4e-9581-10970dcb998c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.674123 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-utilities" (OuterVolumeSpecName: "utilities") pod "c49b0182-cb22-4f55-b7a8-893646fa21fe" (UID: "c49b0182-cb22-4f55-b7a8-893646fa21fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.674384 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.674406 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.674425 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-utilities" (OuterVolumeSpecName: "utilities") pod "caec2fda-bb55-4f4f-a487-8deeb4bf5da4" (UID: "caec2fda-bb55-4f4f-a487-8deeb4bf5da4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.676607 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-kube-api-access-7p4j2" (OuterVolumeSpecName: "kube-api-access-7p4j2") pod "caec2fda-bb55-4f4f-a487-8deeb4bf5da4" (UID: "caec2fda-bb55-4f4f-a487-8deeb4bf5da4"). InnerVolumeSpecName "kube-api-access-7p4j2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.676994 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bdf5393-1e5e-4965-a24c-b45a22c6053e-kube-api-access-8wmxt" (OuterVolumeSpecName: "kube-api-access-8wmxt") pod "2bdf5393-1e5e-4965-a24c-b45a22c6053e" (UID: "2bdf5393-1e5e-4965-a24c-b45a22c6053e"). InnerVolumeSpecName "kube-api-access-8wmxt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.683813 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-utilities" (OuterVolumeSpecName: "utilities") pod "2bdf5393-1e5e-4965-a24c-b45a22c6053e" (UID: "2bdf5393-1e5e-4965-a24c-b45a22c6053e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.692447 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ba127f-5518-4d4e-9581-10970dcb998c-kube-api-access-9gvzz" (OuterVolumeSpecName: "kube-api-access-9gvzz") pod "99ba127f-5518-4d4e-9581-10970dcb998c" (UID: "99ba127f-5518-4d4e-9581-10970dcb998c"). InnerVolumeSpecName "kube-api-access-9gvzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.693102 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49b0182-cb22-4f55-b7a8-893646fa21fe-kube-api-access-2fpgn" (OuterVolumeSpecName: "kube-api-access-2fpgn") pod "c49b0182-cb22-4f55-b7a8-893646fa21fe" (UID: "c49b0182-cb22-4f55-b7a8-893646fa21fe"). InnerVolumeSpecName "kube-api-access-2fpgn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.705609 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bdf5393-1e5e-4965-a24c-b45a22c6053e" (UID: "2bdf5393-1e5e-4965-a24c-b45a22c6053e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.736962 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c49b0182-cb22-4f55-b7a8-893646fa21fe" (UID: "c49b0182-cb22-4f55-b7a8-893646fa21fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.754637 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bzczw"]
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.763836 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "caec2fda-bb55-4f4f-a487-8deeb4bf5da4" (UID: "caec2fda-bb55-4f4f-a487-8deeb4bf5da4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.775083 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.775101 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.775111 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p4j2\" (UniqueName: \"kubernetes.io/projected/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-kube-api-access-7p4j2\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.775130 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fpgn\" (UniqueName: \"kubernetes.io/projected/c49b0182-cb22-4f55-b7a8-893646fa21fe-kube-api-access-2fpgn\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.775140 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gvzz\" (UniqueName: \"kubernetes.io/projected/99ba127f-5518-4d4e-9581-10970dcb998c-kube-api-access-9gvzz\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.775150 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2fda-bb55-4f4f-a487-8deeb4bf5da4-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.775160 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c49b0182-cb22-4f55-b7a8-893646fa21fe-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.775167 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bdf5393-1e5e-4965-a24c-b45a22c6053e-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.775175 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wmxt\" (UniqueName: \"kubernetes.io/projected/2bdf5393-1e5e-4965-a24c-b45a22c6053e-kube-api-access-8wmxt\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.820745 4713 generic.go:334] "Generic (PLEG): container finished" podID="99ba127f-5518-4d4e-9581-10970dcb998c" containerID="73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427" exitCode=0
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.820798 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvtpw" event={"ID":"99ba127f-5518-4d4e-9581-10970dcb998c","Type":"ContainerDied","Data":"73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.820824 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvtpw" event={"ID":"99ba127f-5518-4d4e-9581-10970dcb998c","Type":"ContainerDied","Data":"b5bfbbab6467e14cbc88c8a8bbb5ca5f278e232b2034c14b871230e85131ff16"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.820855 4713 scope.go:117] "RemoveContainer" containerID="73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.820904 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvtpw"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.822158 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bzczw" event={"ID":"2c50b2f7-7be4-4125-94ac-525d908a9e86","Type":"ContainerStarted","Data":"5b1ef2b0827172037804a7b939b79dffc764180d4c61386326d44ca19d0083db"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.822422 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99ba127f-5518-4d4e-9581-10970dcb998c" (UID: "99ba127f-5518-4d4e-9581-10970dcb998c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.840740 4713 scope.go:117] "RemoveContainer" containerID="b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.842778 4713 generic.go:334] "Generic (PLEG): container finished" podID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerID="88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed" exitCode=0
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.842895 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jncw4"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.843320 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncw4" event={"ID":"c49b0182-cb22-4f55-b7a8-893646fa21fe","Type":"ContainerDied","Data":"88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.843349 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncw4" event={"ID":"c49b0182-cb22-4f55-b7a8-893646fa21fe","Type":"ContainerDied","Data":"d63598bb0b7f68069e44c4c93c9ab911a55ec34b2015efdf19ae9fe4cd1981e2"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.854127 4713 generic.go:334] "Generic (PLEG): container finished" podID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerID="34c9aa496b2123d399e7beb414b156db14ab9112aadfb1e8ecb695949d3fd4fe" exitCode=0
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.854197 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" event={"ID":"068aebba-22ff-46cd-856c-e85d409e0ae5","Type":"ContainerDied","Data":"34c9aa496b2123d399e7beb414b156db14ab9112aadfb1e8ecb695949d3fd4fe"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.858181 4713 generic.go:334] "Generic (PLEG): container finished" podID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerID="dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be" exitCode=0
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.858241 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbxl4" event={"ID":"caec2fda-bb55-4f4f-a487-8deeb4bf5da4","Type":"ContainerDied","Data":"dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.858262 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbxl4" event={"ID":"caec2fda-bb55-4f4f-a487-8deeb4bf5da4","Type":"ContainerDied","Data":"1fd34dcb4163de645994fa134aa019d806a4ccec8f942ea4f0e12295740e181d"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.858313 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbxl4"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.861495 4713 generic.go:334] "Generic (PLEG): container finished" podID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerID="dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b" exitCode=0
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.861536 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zt2" event={"ID":"2bdf5393-1e5e-4965-a24c-b45a22c6053e","Type":"ContainerDied","Data":"dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.861565 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9zt2" event={"ID":"2bdf5393-1e5e-4965-a24c-b45a22c6053e","Type":"ContainerDied","Data":"b0719048e481c066e911195f00ff58b9115f1e0ba83461bebb18393dab85d3ff"}
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.861636 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9zt2"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.877773 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99ba127f-5518-4d4e-9581-10970dcb998c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.890623 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jncw4"]
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.893057 4713 scope.go:117] "RemoveContainer" containerID="94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.893589 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jncw4"]
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.915196 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbxl4"]
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.918841 4713 scope.go:117] "RemoveContainer" containerID="73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427"
Mar 14 05:34:48 crc kubenswrapper[4713]: E0314 05:34:48.919392 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427\": container with ID starting with 73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427 not found: ID does not exist" containerID="73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.919428 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427"} err="failed to get container status \"73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427\": rpc error: code = NotFound desc = could not find container \"73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427\": container with ID starting with 73cdbcdeb25d23aaa95fb6a826d9cdce643594ea369551efcf5d1bc1a9dd8427 not found: ID does not exist"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.919448 4713 scope.go:117] "RemoveContainer" containerID="b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2"
Mar 14 05:34:48 crc kubenswrapper[4713]: E0314 05:34:48.919924 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2\": container with ID starting with b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2 not found: ID does not exist" containerID="b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.919955 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2"} err="failed to get container status \"b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2\": rpc error: code = NotFound desc = could not find container \"b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2\": container with ID starting with b091e765edc4df97a2f6f64515dd0ca17625857fa4a6e2473baa75933f3152e2 not found: ID does not exist"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.919980 4713 scope.go:117] "RemoveContainer" containerID="94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452"
Mar 14 05:34:48 crc kubenswrapper[4713]: E0314 05:34:48.920394 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452\": container with ID starting with 94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452 not found: ID does not exist" containerID="94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.920415 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452"} err="failed to get container status \"94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452\": rpc error: code = NotFound desc = could not find container \"94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452\": container with ID starting with 94ab9badcc19d8555251afbd0c2b1a252688fe5018c33062a71ccf105b6c9452 not found: ID does not exist"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.920431 4713 scope.go:117] "RemoveContainer" containerID="88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.921757 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zbxl4"]
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.921949 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.926587 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zt2"]
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.929352 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9zt2"]
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.962293 4713 scope.go:117] "RemoveContainer" containerID="3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.979170 4713 scope.go:117] "RemoveContainer" containerID="6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.992380 4713 scope.go:117] "RemoveContainer" containerID="88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed"
Mar 14 05:34:48 crc kubenswrapper[4713]: E0314 05:34:48.992788 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed\": container with ID starting with 88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed not found: ID does not exist" containerID="88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.992818 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed"} err="failed to get container status \"88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed\": rpc error: code = NotFound desc = could not find container \"88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed\": container with ID starting with 88952ad13cd8c24e55cef3f6154c1e02eb72a7328b038a23bb5cd48819b851ed not found: ID does not exist"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.992842 4713 scope.go:117] "RemoveContainer" containerID="3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7"
Mar 14 05:34:48 crc kubenswrapper[4713]: E0314 05:34:48.993136 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7\": container with ID starting with 3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7 not found: ID does not exist" containerID="3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.993156 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7"} err="failed to get container status \"3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7\": rpc error: code = NotFound desc = could not find container \"3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7\": container with ID starting with 3007a2de90dd6895201a09faef2128030a6f799cfa604ae716f73ad4e671acb7 not found: ID does not exist"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.993169 4713 scope.go:117] "RemoveContainer" containerID="6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2"
Mar 14 05:34:48 crc kubenswrapper[4713]: E0314 05:34:48.993469 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2\": container with ID starting with 6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2 not found: ID does not exist" containerID="6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.993489 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2"} err="failed to get container status \"6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2\": rpc error: code = NotFound desc = could not find container \"6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2\": container with ID starting with 6d956837c0a4a0063894a8e87f8d5417a7144451debeba618e20352bfa1df4e2 not found: ID does not exist"
Mar 14 05:34:48 crc kubenswrapper[4713]: I0314 05:34:48.993503 4713 scope.go:117] "RemoveContainer" containerID="41f262140bd1b3146b013df15eab18ec3370e0bdcd4a75645ef2600b9cf271cd"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.012156 4713 scope.go:117] "RemoveContainer" containerID="dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.027123 4713 scope.go:117] "RemoveContainer" containerID="5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.040269 4713 scope.go:117] "RemoveContainer" containerID="15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.055162 4713 scope.go:117] "RemoveContainer" containerID="dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be"
Mar 14 05:34:49 crc kubenswrapper[4713]: E0314 05:34:49.056093 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be\": container with ID starting with dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be not found: ID does not exist" containerID="dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.056130 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be"} err="failed to get container status \"dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be\": rpc error: code = NotFound desc = could not find container \"dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be\": container with ID starting with dea2c8a6ddf3fecfc80874f1fda7336ed9337d332720d9963d90c9271b0fc9be not found: ID does not exist"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.056160 4713 scope.go:117] "RemoveContainer" containerID="5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690"
Mar 14 05:34:49 crc kubenswrapper[4713]: E0314 05:34:49.056476 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690\": container with ID starting with 5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690 not found: ID does not exist" containerID="5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.056499 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690"} err="failed to get container status \"5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690\": rpc error: code = NotFound desc = could not find container \"5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690\": container with ID starting with 5f3291ee4c5141256ceb4e6c3429c61d311dacafb3d772a50f83f7ba142f9690 not found: ID does not exist"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.056512 4713 scope.go:117] "RemoveContainer" containerID="15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69"
Mar 14 05:34:49 crc kubenswrapper[4713]: E0314 05:34:49.057089 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69\": container with ID starting with 15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69 not found: ID does not exist" containerID="15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.057111 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69"} err="failed to get container status \"15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69\": rpc error: code = NotFound desc = could not find container \"15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69\": container with ID starting with 15936d36a4237c8804dc8891663db723528f19660ba5999e92304ccdfbf32f69 not found: ID does not exist"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.057124 4713 scope.go:117] "RemoveContainer" containerID="dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.074657 4713 scope.go:117] "RemoveContainer" containerID="714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77"
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.079798 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-trusted-ca\") pod \"068aebba-22ff-46cd-856c-e85d409e0ae5\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") "
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.079862 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-operator-metrics\") pod \"068aebba-22ff-46cd-856c-e85d409e0ae5\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") "
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.079903 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9gcv\" (UniqueName: \"kubernetes.io/projected/068aebba-22ff-46cd-856c-e85d409e0ae5-kube-api-access-z9gcv\") pod \"068aebba-22ff-46cd-856c-e85d409e0ae5\" (UID: \"068aebba-22ff-46cd-856c-e85d409e0ae5\") "
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.080486 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "068aebba-22ff-46cd-856c-e85d409e0ae5" (UID: "068aebba-22ff-46cd-856c-e85d409e0ae5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.084434 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "068aebba-22ff-46cd-856c-e85d409e0ae5" (UID: "068aebba-22ff-46cd-856c-e85d409e0ae5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.084510 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068aebba-22ff-46cd-856c-e85d409e0ae5-kube-api-access-z9gcv" (OuterVolumeSpecName: "kube-api-access-z9gcv") pod "068aebba-22ff-46cd-856c-e85d409e0ae5" (UID: "068aebba-22ff-46cd-856c-e85d409e0ae5"). InnerVolumeSpecName "kube-api-access-z9gcv".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.088369 4713 scope.go:117] "RemoveContainer" containerID="585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.103836 4713 scope.go:117] "RemoveContainer" containerID="dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b" Mar 14 05:34:49 crc kubenswrapper[4713]: E0314 05:34:49.104269 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b\": container with ID starting with dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b not found: ID does not exist" containerID="dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.104297 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b"} err="failed to get container status \"dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b\": rpc error: code = NotFound desc = could not find container \"dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b\": container with ID starting with dbebfa6d02a29132e44abc1e04c81960aedd9c4378801985bf9aabb94cf9df5b not found: ID does not exist" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.104318 4713 scope.go:117] "RemoveContainer" containerID="714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77" Mar 14 05:34:49 crc kubenswrapper[4713]: E0314 05:34:49.104583 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77\": container with ID starting with 
714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77 not found: ID does not exist" containerID="714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.104643 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77"} err="failed to get container status \"714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77\": rpc error: code = NotFound desc = could not find container \"714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77\": container with ID starting with 714be6848add5a51b1ebabdbbd1137da4cef77cfb914d45a62eaac99752c1b77 not found: ID does not exist" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.104694 4713 scope.go:117] "RemoveContainer" containerID="585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5" Mar 14 05:34:49 crc kubenswrapper[4713]: E0314 05:34:49.105109 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5\": container with ID starting with 585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5 not found: ID does not exist" containerID="585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.105168 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5"} err="failed to get container status \"585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5\": rpc error: code = NotFound desc = could not find container \"585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5\": container with ID starting with 585f584676d722ef45320773c83fca1493e3356c6301bc53fa1492967d1f09c5 not found: ID does not 
exist" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.148546 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvtpw"] Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.151715 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dvtpw"] Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.180983 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.181008 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/068aebba-22ff-46cd-856c-e85d409e0ae5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.181017 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9gcv\" (UniqueName: \"kubernetes.io/projected/068aebba-22ff-46cd-856c-e85d409e0ae5-kube-api-access-z9gcv\") on node \"crc\" DevicePath \"\"" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.569574 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" path="/var/lib/kubelet/pods/2bdf5393-1e5e-4965-a24c-b45a22c6053e/volumes" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.570789 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" path="/var/lib/kubelet/pods/99ba127f-5518-4d4e-9581-10970dcb998c/volumes" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.571612 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" path="/var/lib/kubelet/pods/c49b0182-cb22-4f55-b7a8-893646fa21fe/volumes" Mar 14 05:34:49 crc kubenswrapper[4713]: 
I0314 05:34:49.573476 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" path="/var/lib/kubelet/pods/caec2fda-bb55-4f4f-a487-8deeb4bf5da4/volumes" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.867271 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bzczw" event={"ID":"2c50b2f7-7be4-4125-94ac-525d908a9e86","Type":"ContainerStarted","Data":"4219ace6ab4a505773687802845605d309c8f976fe6425226629dd1bf9f40ae8"} Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.867523 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bzczw" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.869647 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" event={"ID":"068aebba-22ff-46cd-856c-e85d409e0ae5","Type":"ContainerDied","Data":"145b8bd8e2da8692d69016dd984b559d4e2eb2a2b74b30257b1ff536d1429536"} Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.869654 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mw4tj" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.869687 4713 scope.go:117] "RemoveContainer" containerID="34c9aa496b2123d399e7beb414b156db14ab9112aadfb1e8ecb695949d3fd4fe" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.870359 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bzczw" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.888188 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bzczw" podStartSLOduration=1.888169364 podStartE2EDuration="1.888169364s" podCreationTimestamp="2026-03-14 05:34:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:34:49.885833154 +0000 UTC m=+472.973742464" watchObservedRunningTime="2026-03-14 05:34:49.888169364 +0000 UTC m=+472.976078664" Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.955120 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mw4tj"] Mar 14 05:34:49 crc kubenswrapper[4713]: I0314 05:34:49.955175 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mw4tj"] Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275047 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qhnqn"] Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275281 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerName="marketplace-operator" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275296 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" 
containerName="marketplace-operator" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275305 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerName="extract-content" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275312 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerName="extract-content" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275324 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerName="extract-utilities" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275331 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerName="extract-utilities" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275339 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275345 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275360 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerName="extract-content" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275366 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerName="extract-content" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275374 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerName="extract-utilities" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275381 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" 
containerName="extract-utilities" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275391 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" containerName="extract-content" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275398 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" containerName="extract-content" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275406 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerName="extract-content" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275414 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerName="extract-content" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275423 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerName="extract-utilities" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275430 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerName="extract-utilities" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275442 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275448 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275456 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" containerName="extract-utilities" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275463 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" 
containerName="extract-utilities" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275471 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275478 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275491 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275497 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275606 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerName="marketplace-operator" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275616 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ba127f-5518-4d4e-9581-10970dcb998c" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275625 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49b0182-cb22-4f55-b7a8-893646fa21fe" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275638 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="caec2fda-bb55-4f4f-a487-8deeb4bf5da4" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275647 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bdf5393-1e5e-4965-a24c-b45a22c6053e" containerName="registry-server" Mar 14 05:34:50 crc kubenswrapper[4713]: E0314 05:34:50.275748 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerName="marketplace-operator" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275756 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerName="marketplace-operator" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.275865 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" containerName="marketplace-operator" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.276578 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.278557 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.292670 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhnqn"] Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.343228 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc4d2c5d-cf64-489f-9229-3e79a6e369c3-utilities\") pod \"redhat-marketplace-qhnqn\" (UID: \"fc4d2c5d-cf64-489f-9229-3e79a6e369c3\") " pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.343276 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc4d2c5d-cf64-489f-9229-3e79a6e369c3-catalog-content\") pod \"redhat-marketplace-qhnqn\" (UID: \"fc4d2c5d-cf64-489f-9229-3e79a6e369c3\") " pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.343335 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58kqn\" (UniqueName: \"kubernetes.io/projected/fc4d2c5d-cf64-489f-9229-3e79a6e369c3-kube-api-access-58kqn\") pod \"redhat-marketplace-qhnqn\" (UID: \"fc4d2c5d-cf64-489f-9229-3e79a6e369c3\") " pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.444128 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc4d2c5d-cf64-489f-9229-3e79a6e369c3-utilities\") pod \"redhat-marketplace-qhnqn\" (UID: \"fc4d2c5d-cf64-489f-9229-3e79a6e369c3\") " pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.444186 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc4d2c5d-cf64-489f-9229-3e79a6e369c3-catalog-content\") pod \"redhat-marketplace-qhnqn\" (UID: \"fc4d2c5d-cf64-489f-9229-3e79a6e369c3\") " pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.444262 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58kqn\" (UniqueName: \"kubernetes.io/projected/fc4d2c5d-cf64-489f-9229-3e79a6e369c3-kube-api-access-58kqn\") pod \"redhat-marketplace-qhnqn\" (UID: \"fc4d2c5d-cf64-489f-9229-3e79a6e369c3\") " pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.444908 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc4d2c5d-cf64-489f-9229-3e79a6e369c3-utilities\") pod \"redhat-marketplace-qhnqn\" (UID: \"fc4d2c5d-cf64-489f-9229-3e79a6e369c3\") " pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.444982 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc4d2c5d-cf64-489f-9229-3e79a6e369c3-catalog-content\") pod \"redhat-marketplace-qhnqn\" (UID: \"fc4d2c5d-cf64-489f-9229-3e79a6e369c3\") " pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.464761 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58kqn\" (UniqueName: \"kubernetes.io/projected/fc4d2c5d-cf64-489f-9229-3e79a6e369c3-kube-api-access-58kqn\") pod \"redhat-marketplace-qhnqn\" (UID: \"fc4d2c5d-cf64-489f-9229-3e79a6e369c3\") " pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.479198 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-525cq"] Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.480604 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.483528 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.501812 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-525cq"] Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.643632 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.646506 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd-utilities\") pod \"redhat-operators-525cq\" (UID: \"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd\") " pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.646649 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd-catalog-content\") pod \"redhat-operators-525cq\" (UID: \"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd\") " pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.646686 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfs2s\" (UniqueName: \"kubernetes.io/projected/d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd-kube-api-access-cfs2s\") pod \"redhat-operators-525cq\" (UID: \"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd\") " pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.748889 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd-catalog-content\") pod \"redhat-operators-525cq\" (UID: \"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd\") " pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.749301 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfs2s\" (UniqueName: \"kubernetes.io/projected/d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd-kube-api-access-cfs2s\") pod 
\"redhat-operators-525cq\" (UID: \"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd\") " pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:50 crc kubenswrapper[4713]: I0314 05:34:50.749402 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd-utilities\") pod \"redhat-operators-525cq\" (UID: \"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd\") " pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:51 crc kubenswrapper[4713]: I0314 05:34:50.749780 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd-catalog-content\") pod \"redhat-operators-525cq\" (UID: \"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd\") " pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:51 crc kubenswrapper[4713]: I0314 05:34:50.749928 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd-utilities\") pod \"redhat-operators-525cq\" (UID: \"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd\") " pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:51 crc kubenswrapper[4713]: I0314 05:34:50.772373 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfs2s\" (UniqueName: \"kubernetes.io/projected/d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd-kube-api-access-cfs2s\") pod \"redhat-operators-525cq\" (UID: \"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd\") " pod="openshift-marketplace/redhat-operators-525cq" Mar 14 05:34:51 crc kubenswrapper[4713]: I0314 05:34:50.803159 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-525cq"
Mar 14 05:34:51 crc kubenswrapper[4713]: I0314 05:34:51.573107 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="068aebba-22ff-46cd-856c-e85d409e0ae5" path="/var/lib/kubelet/pods/068aebba-22ff-46cd-856c-e85d409e0ae5/volumes"
Mar 14 05:34:51 crc kubenswrapper[4713]: I0314 05:34:51.998015 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-525cq"]
Mar 14 05:34:52 crc kubenswrapper[4713]: W0314 05:34:52.002494 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd675a43d_ebc9_4ad8_92c6_89ecb59ea8fd.slice/crio-45bd8e196d437bf814df8c5324cc0d7d99216575aee1f8127b19e67da4b9a899 WatchSource:0}: Error finding container 45bd8e196d437bf814df8c5324cc0d7d99216575aee1f8127b19e67da4b9a899: Status 404 returned error can't find the container with id 45bd8e196d437bf814df8c5324cc0d7d99216575aee1f8127b19e67da4b9a899
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.005333 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qhnqn"]
Mar 14 05:34:52 crc kubenswrapper[4713]: W0314 05:34:52.011311 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc4d2c5d_cf64_489f_9229_3e79a6e369c3.slice/crio-9550dd6bff329a3a9f8e8b1a6f5854799ee592651a4d6c793e7bd3ef49fa5d08 WatchSource:0}: Error finding container 9550dd6bff329a3a9f8e8b1a6f5854799ee592651a4d6c793e7bd3ef49fa5d08: Status 404 returned error can't find the container with id 9550dd6bff329a3a9f8e8b1a6f5854799ee592651a4d6c793e7bd3ef49fa5d08
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.674002 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mgcsn"]
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.677976 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.680483 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.683484 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mgcsn"]
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.873610 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-prgds"]
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.875173 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.875639 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qz6j\" (UniqueName: \"kubernetes.io/projected/b28e03d1-af1f-4f04-ac10-91fce1fde925-kube-api-access-6qz6j\") pod \"community-operators-mgcsn\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.875693 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-utilities\") pod \"community-operators-mgcsn\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.875828 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-catalog-content\") pod \"community-operators-mgcsn\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.877305 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.888421 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-prgds"]
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.897025 4713 generic.go:334] "Generic (PLEG): container finished" podID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerID="043c27563acb3229add162d37def8105ec59e021adfa66e6839ae099e063d362" exitCode=0
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.897448 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525cq" event={"ID":"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd","Type":"ContainerDied","Data":"043c27563acb3229add162d37def8105ec59e021adfa66e6839ae099e063d362"}
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.897497 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525cq" event={"ID":"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd","Type":"ContainerStarted","Data":"45bd8e196d437bf814df8c5324cc0d7d99216575aee1f8127b19e67da4b9a899"}
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.899864 4713 generic.go:334] "Generic (PLEG): container finished" podID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerID="d12d1cf9941c11aa2dfeac0a9e719cbb41944ba9398373b25ab0e04846303a5f" exitCode=0
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.899902 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhnqn" event={"ID":"fc4d2c5d-cf64-489f-9229-3e79a6e369c3","Type":"ContainerDied","Data":"d12d1cf9941c11aa2dfeac0a9e719cbb41944ba9398373b25ab0e04846303a5f"}
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.899928 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhnqn" event={"ID":"fc4d2c5d-cf64-489f-9229-3e79a6e369c3","Type":"ContainerStarted","Data":"9550dd6bff329a3a9f8e8b1a6f5854799ee592651a4d6c793e7bd3ef49fa5d08"}
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.977922 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-catalog-content\") pod \"community-operators-mgcsn\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.977998 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f78f5a-d3da-4bf6-bf82-c98dbbe9602f-catalog-content\") pod \"certified-operators-prgds\" (UID: \"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f\") " pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.978044 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqjr\" (UniqueName: \"kubernetes.io/projected/58f78f5a-d3da-4bf6-bf82-c98dbbe9602f-kube-api-access-bkqjr\") pod \"certified-operators-prgds\" (UID: \"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f\") " pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.978071 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qz6j\" (UniqueName: \"kubernetes.io/projected/b28e03d1-af1f-4f04-ac10-91fce1fde925-kube-api-access-6qz6j\") pod \"community-operators-mgcsn\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.978096 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-utilities\") pod \"community-operators-mgcsn\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.978164 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f78f5a-d3da-4bf6-bf82-c98dbbe9602f-utilities\") pod \"certified-operators-prgds\" (UID: \"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f\") " pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.979117 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-utilities\") pod \"community-operators-mgcsn\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.979531 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-catalog-content\") pod \"community-operators-mgcsn\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:52 crc kubenswrapper[4713]: I0314 05:34:52.996592 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qz6j\" (UniqueName: \"kubernetes.io/projected/b28e03d1-af1f-4f04-ac10-91fce1fde925-kube-api-access-6qz6j\") pod \"community-operators-mgcsn\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.029561 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.079032 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f78f5a-d3da-4bf6-bf82-c98dbbe9602f-catalog-content\") pod \"certified-operators-prgds\" (UID: \"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f\") " pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.079403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqjr\" (UniqueName: \"kubernetes.io/projected/58f78f5a-d3da-4bf6-bf82-c98dbbe9602f-kube-api-access-bkqjr\") pod \"certified-operators-prgds\" (UID: \"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f\") " pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.079524 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f78f5a-d3da-4bf6-bf82-c98dbbe9602f-utilities\") pod \"certified-operators-prgds\" (UID: \"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f\") " pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.079575 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f78f5a-d3da-4bf6-bf82-c98dbbe9602f-catalog-content\") pod \"certified-operators-prgds\" (UID: \"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f\") " pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.079931 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f78f5a-d3da-4bf6-bf82-c98dbbe9602f-utilities\") pod \"certified-operators-prgds\" (UID: \"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f\") " pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.096662 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqjr\" (UniqueName: \"kubernetes.io/projected/58f78f5a-d3da-4bf6-bf82-c98dbbe9602f-kube-api-access-bkqjr\") pod \"certified-operators-prgds\" (UID: \"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f\") " pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.197930 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.203273 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mgcsn"]
Mar 14 05:34:53 crc kubenswrapper[4713]: W0314 05:34:53.210536 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb28e03d1_af1f_4f04_ac10_91fce1fde925.slice/crio-5616bc0c6a831d71f62de264dee036ae64c9a2a681529f2bb08ff79989847518 WatchSource:0}: Error finding container 5616bc0c6a831d71f62de264dee036ae64c9a2a681529f2bb08ff79989847518: Status 404 returned error can't find the container with id 5616bc0c6a831d71f62de264dee036ae64c9a2a681529f2bb08ff79989847518
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.581910 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-prgds"]
Mar 14 05:34:53 crc kubenswrapper[4713]: W0314 05:34:53.584101 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58f78f5a_d3da_4bf6_bf82_c98dbbe9602f.slice/crio-50a8002f3d10bd4af5b25380c4013a3e27e89c37c40a9f8439e09e121785577c WatchSource:0}: Error finding container 50a8002f3d10bd4af5b25380c4013a3e27e89c37c40a9f8439e09e121785577c: Status 404 returned error can't find the container with id 50a8002f3d10bd4af5b25380c4013a3e27e89c37c40a9f8439e09e121785577c
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.906661 4713 generic.go:334] "Generic (PLEG): container finished" podID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerID="47edbe4e560c90a5e421fc848956b7dbad838bfffeaa11754ad0e257af9a6aa6" exitCode=0
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.906749 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgcsn" event={"ID":"b28e03d1-af1f-4f04-ac10-91fce1fde925","Type":"ContainerDied","Data":"47edbe4e560c90a5e421fc848956b7dbad838bfffeaa11754ad0e257af9a6aa6"}
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.906801 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgcsn" event={"ID":"b28e03d1-af1f-4f04-ac10-91fce1fde925","Type":"ContainerStarted","Data":"5616bc0c6a831d71f62de264dee036ae64c9a2a681529f2bb08ff79989847518"}
Mar 14 05:34:53 crc kubenswrapper[4713]: I0314 05:34:53.907482 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prgds" event={"ID":"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f","Type":"ContainerStarted","Data":"50a8002f3d10bd4af5b25380c4013a3e27e89c37c40a9f8439e09e121785577c"}
Mar 14 05:34:54 crc kubenswrapper[4713]: I0314 05:34:54.914759 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgcsn" event={"ID":"b28e03d1-af1f-4f04-ac10-91fce1fde925","Type":"ContainerStarted","Data":"94461d3537515d3237d886b0f3c41389a1169c6966cc506f56798a98cde352de"}
Mar 14 05:34:54 crc kubenswrapper[4713]: I0314 05:34:54.916382 4713 generic.go:334] "Generic (PLEG): container finished" podID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerID="9ef2af50968fec67220a234ac3b1880c6f1540636c012161bf8b62cd1e5a8013" exitCode=0
Mar 14 05:34:54 crc kubenswrapper[4713]: I0314 05:34:54.916454 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhnqn" event={"ID":"fc4d2c5d-cf64-489f-9229-3e79a6e369c3","Type":"ContainerDied","Data":"9ef2af50968fec67220a234ac3b1880c6f1540636c012161bf8b62cd1e5a8013"}
Mar 14 05:34:54 crc kubenswrapper[4713]: I0314 05:34:54.918615 4713 generic.go:334] "Generic (PLEG): container finished" podID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerID="3984eb84944f361ea7b0baa6ad96f0c6ad6523d5e328ebc12d1f28350b0f5470" exitCode=0
Mar 14 05:34:54 crc kubenswrapper[4713]: I0314 05:34:54.918689 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525cq" event={"ID":"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd","Type":"ContainerDied","Data":"3984eb84944f361ea7b0baa6ad96f0c6ad6523d5e328ebc12d1f28350b0f5470"}
Mar 14 05:34:54 crc kubenswrapper[4713]: I0314 05:34:54.925892 4713 generic.go:334] "Generic (PLEG): container finished" podID="58f78f5a-d3da-4bf6-bf82-c98dbbe9602f" containerID="04c7cc4611a2432816b4f5054b742d689390a559251a502b0bacc75175735598" exitCode=0
Mar 14 05:34:54 crc kubenswrapper[4713]: I0314 05:34:54.925933 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prgds" event={"ID":"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f","Type":"ContainerDied","Data":"04c7cc4611a2432816b4f5054b742d689390a559251a502b0bacc75175735598"}
Mar 14 05:34:55 crc kubenswrapper[4713]: I0314 05:34:55.009960 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-42m74"
Mar 14 05:34:55 crc kubenswrapper[4713]: I0314 05:34:55.064713 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rd5mn"]
Mar 14 05:34:55 crc kubenswrapper[4713]: I0314 05:34:55.936340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhnqn" event={"ID":"fc4d2c5d-cf64-489f-9229-3e79a6e369c3","Type":"ContainerStarted","Data":"9bb08d20a1e1663cd264d06a36c564956bee78bc166d1ff6760d72e4a9475ecb"}
Mar 14 05:34:55 crc kubenswrapper[4713]: I0314 05:34:55.939456 4713 generic.go:334] "Generic (PLEG): container finished" podID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerID="94461d3537515d3237d886b0f3c41389a1169c6966cc506f56798a98cde352de" exitCode=0
Mar 14 05:34:55 crc kubenswrapper[4713]: I0314 05:34:55.939513 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgcsn" event={"ID":"b28e03d1-af1f-4f04-ac10-91fce1fde925","Type":"ContainerDied","Data":"94461d3537515d3237d886b0f3c41389a1169c6966cc506f56798a98cde352de"}
Mar 14 05:34:55 crc kubenswrapper[4713]: I0314 05:34:55.958194 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qhnqn" podStartSLOduration=3.519833944 podStartE2EDuration="5.958168822s" podCreationTimestamp="2026-03-14 05:34:50 +0000 UTC" firstStartedPulling="2026-03-14 05:34:52.90112024 +0000 UTC m=+475.989029550" lastFinishedPulling="2026-03-14 05:34:55.339455128 +0000 UTC m=+478.427364428" observedRunningTime="2026-03-14 05:34:55.951856495 +0000 UTC m=+479.039765795" watchObservedRunningTime="2026-03-14 05:34:55.958168822 +0000 UTC m=+479.046078142"
Mar 14 05:34:56 crc kubenswrapper[4713]: I0314 05:34:56.953802 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgcsn" event={"ID":"b28e03d1-af1f-4f04-ac10-91fce1fde925","Type":"ContainerStarted","Data":"a20dd678b38e40c192cb2b14c7558caf95cd04333fa8aa371946fc600ec1f42e"}
Mar 14 05:34:56 crc kubenswrapper[4713]: I0314 05:34:56.956586 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525cq" event={"ID":"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd","Type":"ContainerStarted","Data":"59eb6b59ba6e54cb3e5c4aadf922604d1c2324e1b1542e24eab771fe130db99c"}
Mar 14 05:34:56 crc kubenswrapper[4713]: I0314 05:34:56.965249 4713 generic.go:334] "Generic (PLEG): container finished" podID="58f78f5a-d3da-4bf6-bf82-c98dbbe9602f" containerID="a7c250156a0708ee32de1f22ae553dd5965fd671268e89ee55f1474c1b5e48f8" exitCode=0
Mar 14 05:34:56 crc kubenswrapper[4713]: I0314 05:34:56.965437 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prgds" event={"ID":"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f","Type":"ContainerDied","Data":"a7c250156a0708ee32de1f22ae553dd5965fd671268e89ee55f1474c1b5e48f8"}
Mar 14 05:34:56 crc kubenswrapper[4713]: I0314 05:34:56.989602 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mgcsn" podStartSLOduration=2.554944807 podStartE2EDuration="4.989580488s" podCreationTimestamp="2026-03-14 05:34:52 +0000 UTC" firstStartedPulling="2026-03-14 05:34:53.90906313 +0000 UTC m=+476.996972430" lastFinishedPulling="2026-03-14 05:34:56.343698811 +0000 UTC m=+479.431608111" observedRunningTime="2026-03-14 05:34:56.974320184 +0000 UTC m=+480.062229514" watchObservedRunningTime="2026-03-14 05:34:56.989580488 +0000 UTC m=+480.077489788"
Mar 14 05:34:57 crc kubenswrapper[4713]: I0314 05:34:57.021878 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-525cq" podStartSLOduration=3.95151984 podStartE2EDuration="7.021860228s" podCreationTimestamp="2026-03-14 05:34:50 +0000 UTC" firstStartedPulling="2026-03-14 05:34:52.898303473 +0000 UTC m=+475.986212773" lastFinishedPulling="2026-03-14 05:34:55.968643861 +0000 UTC m=+479.056553161" observedRunningTime="2026-03-14 05:34:57.020132888 +0000 UTC m=+480.108042198" watchObservedRunningTime="2026-03-14 05:34:57.021860228 +0000 UTC m=+480.109769528"
Mar 14 05:34:57 crc kubenswrapper[4713]: I0314 05:34:57.971822 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prgds" event={"ID":"58f78f5a-d3da-4bf6-bf82-c98dbbe9602f","Type":"ContainerStarted","Data":"133fc1c39b357e31f16fca4d228383e8bdc8d639a6e9e3810ed9c0f21ec22669"}
Mar 14 05:35:00 crc kubenswrapper[4713]: I0314 05:35:00.645159 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qhnqn"
Mar 14 05:35:00 crc kubenswrapper[4713]: I0314 05:35:00.645775 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qhnqn"
Mar 14 05:35:00 crc kubenswrapper[4713]: I0314 05:35:00.718257 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qhnqn"
Mar 14 05:35:00 crc kubenswrapper[4713]: I0314 05:35:00.740346 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-prgds" podStartSLOduration=6.2946482790000005 podStartE2EDuration="8.740328389s" podCreationTimestamp="2026-03-14 05:34:52 +0000 UTC" firstStartedPulling="2026-03-14 05:34:54.92747853 +0000 UTC m=+478.015387830" lastFinishedPulling="2026-03-14 05:34:57.37315864 +0000 UTC m=+480.461067940" observedRunningTime="2026-03-14 05:34:57.995104685 +0000 UTC m=+481.083013985" watchObservedRunningTime="2026-03-14 05:35:00.740328389 +0000 UTC m=+483.828237689"
Mar 14 05:35:00 crc kubenswrapper[4713]: I0314 05:35:00.804989 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-525cq"
Mar 14 05:35:00 crc kubenswrapper[4713]: I0314 05:35:00.805279 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-525cq"
Mar 14 05:35:01 crc kubenswrapper[4713]: I0314 05:35:01.019611 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qhnqn"
Mar 14 05:35:01 crc kubenswrapper[4713]: I0314 05:35:01.866953 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=<
Mar 14 05:35:01 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 05:35:01 crc kubenswrapper[4713]: >
Mar 14 05:35:03 crc kubenswrapper[4713]: I0314 05:35:03.030628 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:35:03 crc kubenswrapper[4713]: I0314 05:35:03.030973 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:35:03 crc kubenswrapper[4713]: I0314 05:35:03.088981 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:35:03 crc kubenswrapper[4713]: I0314 05:35:03.198491 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:35:03 crc kubenswrapper[4713]: I0314 05:35:03.198560 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:35:03 crc kubenswrapper[4713]: I0314 05:35:03.252019 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:35:04 crc kubenswrapper[4713]: I0314 05:35:04.041739 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-prgds"
Mar 14 05:35:04 crc kubenswrapper[4713]: I0314 05:35:04.058043 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mgcsn"
Mar 14 05:35:10 crc kubenswrapper[4713]: I0314 05:35:10.731807 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 05:35:10 crc kubenswrapper[4713]: I0314 05:35:10.732331 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 05:35:10 crc kubenswrapper[4713]: I0314 05:35:10.732378 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5"
Mar 14 05:35:10 crc kubenswrapper[4713]: I0314 05:35:10.733073 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3178c6299ee5084d508d01472b2baefd0a7f8c581742b5a075487a50da502998"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 05:35:10 crc kubenswrapper[4713]: I0314 05:35:10.733148 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://3178c6299ee5084d508d01472b2baefd0a7f8c581742b5a075487a50da502998" gracePeriod=600
Mar 14 05:35:10 crc kubenswrapper[4713]: I0314 05:35:10.850344 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-525cq"
Mar 14 05:35:10 crc kubenswrapper[4713]: I0314 05:35:10.894457 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-525cq"
Mar 14 05:35:12 crc kubenswrapper[4713]: I0314 05:35:12.041797 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="3178c6299ee5084d508d01472b2baefd0a7f8c581742b5a075487a50da502998" exitCode=0
Mar 14 05:35:12 crc kubenswrapper[4713]: I0314 05:35:12.041872 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"3178c6299ee5084d508d01472b2baefd0a7f8c581742b5a075487a50da502998"}
Mar 14 05:35:12 crc kubenswrapper[4713]: I0314 05:35:12.042823 4713 scope.go:117] "RemoveContainer" containerID="1619405adcac63cb13b0884bf7c696b73cb2dc01839dec574159c6134c3afc38"
Mar 14 05:35:13 crc kubenswrapper[4713]: I0314 05:35:13.049700 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"b547c1a6b15d35c71b1fe36925c299d8cf39995de4a55026e9398295e2918673"}
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.100055 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" podUID="cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" containerName="registry" containerID="cri-o://16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5" gracePeriod=30
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.476255 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.635700 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slc5q\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-kube-api-access-slc5q\") pod \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") "
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.635970 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") "
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.636005 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-bound-sa-token\") pod \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") "
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.636029 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-trusted-ca\") pod \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") "
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.636071 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-ca-trust-extracted\") pod \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") "
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.636089 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-installation-pull-secrets\") pod \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") "
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.636118 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-certificates\") pod \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") "
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.636175 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-tls\") pod \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\" (UID: \"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a\") "
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.636846 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.637015 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.643392 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.644078 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.647618 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-kube-api-access-slc5q" (OuterVolumeSpecName: "kube-api-access-slc5q") pod "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a"). InnerVolumeSpecName "kube-api-access-slc5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.647979 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.650018 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.655635 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" (UID: "cedfba83-e56b-4913-9ac5-b5bbf3e71b7a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.737385 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.737414 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.737423 4713 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.737432 4713 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.737443 4713 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.737451 4713 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 14 05:35:20 crc kubenswrapper[4713]: I0314 05:35:20.737459 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slc5q\" (UniqueName: \"kubernetes.io/projected/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a-kube-api-access-slc5q\") on node \"crc\" DevicePath \"\""
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.104458 4713 generic.go:334] "Generic (PLEG): container finished" podID="cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" containerID="16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5" exitCode=0
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.104497 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" event={"ID":"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a","Type":"ContainerDied","Data":"16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5"}
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.104522 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn" event={"ID":"cedfba83-e56b-4913-9ac5-b5bbf3e71b7a","Type":"ContainerDied","Data":"9d6eac711d2c530ab59214a55c323c95b4477e628d0e7bec4ca5bd4c42342df3"}
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.104538 4713 scope.go:117] "RemoveContainer" containerID="16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5"
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.104629 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rd5mn"
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.126435 4713 scope.go:117] "RemoveContainer" containerID="16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5"
Mar 14 05:35:21 crc kubenswrapper[4713]: E0314 05:35:21.128887 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5\": container with ID starting with 16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5 not found: ID does not exist" containerID="16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5"
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.128934 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5"} err="failed to get container status \"16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5\": rpc error: code = NotFound desc = could not find container \"16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5\": container with ID starting with 16952704f040fc1f02a1f7014e2489d538d9b50db193ea8ef2b5706d90d165a5 not found: ID does not exist"
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.132036 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rd5mn"]
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.139989 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rd5mn"]
Mar 14 05:35:21 crc kubenswrapper[4713]: I0314 05:35:21.571931 4713 kubelet_volumes.go:163] "Cleaned up
orphaned pod volumes dir" podUID="cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" path="/var/lib/kubelet/pods/cedfba83-e56b-4913-9ac5-b5bbf3e71b7a/volumes" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.307288 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s"] Mar 14 05:35:42 crc kubenswrapper[4713]: E0314 05:35:42.308456 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" containerName="registry" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.308483 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" containerName="registry" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.308664 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="cedfba83-e56b-4913-9ac5-b5bbf3e71b7a" containerName="registry" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.309326 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.315683 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.315723 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s"] Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.315760 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.316093 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.316231 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.316284 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.326602 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2286c66a-174f-4338-a3f9-a890f63e219e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-xx48s\" (UID: \"2286c66a-174f-4338-a3f9-a890f63e219e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.326702 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbt5r\" (UniqueName: \"kubernetes.io/projected/2286c66a-174f-4338-a3f9-a890f63e219e-kube-api-access-nbt5r\") pod 
\"cluster-monitoring-operator-6d5b84845-xx48s\" (UID: \"2286c66a-174f-4338-a3f9-a890f63e219e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.326750 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2286c66a-174f-4338-a3f9-a890f63e219e-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-xx48s\" (UID: \"2286c66a-174f-4338-a3f9-a890f63e219e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.427923 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2286c66a-174f-4338-a3f9-a890f63e219e-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-xx48s\" (UID: \"2286c66a-174f-4338-a3f9-a890f63e219e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.427978 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2286c66a-174f-4338-a3f9-a890f63e219e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-xx48s\" (UID: \"2286c66a-174f-4338-a3f9-a890f63e219e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.428100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbt5r\" (UniqueName: \"kubernetes.io/projected/2286c66a-174f-4338-a3f9-a890f63e219e-kube-api-access-nbt5r\") pod \"cluster-monitoring-operator-6d5b84845-xx48s\" (UID: \"2286c66a-174f-4338-a3f9-a890f63e219e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc 
kubenswrapper[4713]: I0314 05:35:42.431829 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2286c66a-174f-4338-a3f9-a890f63e219e-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-xx48s\" (UID: \"2286c66a-174f-4338-a3f9-a890f63e219e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.435184 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2286c66a-174f-4338-a3f9-a890f63e219e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-xx48s\" (UID: \"2286c66a-174f-4338-a3f9-a890f63e219e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.449984 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbt5r\" (UniqueName: \"kubernetes.io/projected/2286c66a-174f-4338-a3f9-a890f63e219e-kube-api-access-nbt5r\") pod \"cluster-monitoring-operator-6d5b84845-xx48s\" (UID: \"2286c66a-174f-4338-a3f9-a890f63e219e\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:42 crc kubenswrapper[4713]: I0314 05:35:42.647301 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" Mar 14 05:35:43 crc kubenswrapper[4713]: I0314 05:35:43.109983 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s"] Mar 14 05:35:43 crc kubenswrapper[4713]: I0314 05:35:43.120411 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 05:35:43 crc kubenswrapper[4713]: I0314 05:35:43.245721 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" event={"ID":"2286c66a-174f-4338-a3f9-a890f63e219e","Type":"ContainerStarted","Data":"e4c8bbaeb69b78678085960355d6ed91df18a133fa0b46763fb6fc9e51ca7858"} Mar 14 05:35:45 crc kubenswrapper[4713]: I0314 05:35:45.258612 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" event={"ID":"2286c66a-174f-4338-a3f9-a890f63e219e","Type":"ContainerStarted","Data":"5f47d5501d2da6cb8153905866d4b4e2c8551dc28ea6de9f132d1af58157b4b2"} Mar 14 05:35:45 crc kubenswrapper[4713]: I0314 05:35:45.280163 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-xx48s" podStartSLOduration=1.525010857 podStartE2EDuration="3.280142866s" podCreationTimestamp="2026-03-14 05:35:42 +0000 UTC" firstStartedPulling="2026-03-14 05:35:43.119967503 +0000 UTC m=+526.207876813" lastFinishedPulling="2026-03-14 05:35:44.875099532 +0000 UTC m=+527.963008822" observedRunningTime="2026-03-14 05:35:45.27965287 +0000 UTC m=+528.367562170" watchObservedRunningTime="2026-03-14 05:35:45.280142866 +0000 UTC m=+528.368052166" Mar 14 05:35:45 crc kubenswrapper[4713]: I0314 05:35:45.467145 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx"] Mar 14 05:35:45 crc 
kubenswrapper[4713]: I0314 05:35:45.467957 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" Mar 14 05:35:45 crc kubenswrapper[4713]: I0314 05:35:45.469399 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-qwkcp" Mar 14 05:35:45 crc kubenswrapper[4713]: I0314 05:35:45.469547 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 14 05:35:45 crc kubenswrapper[4713]: I0314 05:35:45.478736 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx"] Mar 14 05:35:45 crc kubenswrapper[4713]: I0314 05:35:45.665814 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b3d449c0-bf37-40e8-9e4c-14f586d1f0b3-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dhljx\" (UID: \"b3d449c0-bf37-40e8-9e4c-14f586d1f0b3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" Mar 14 05:35:45 crc kubenswrapper[4713]: I0314 05:35:45.810474 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b3d449c0-bf37-40e8-9e4c-14f586d1f0b3-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dhljx\" (UID: \"b3d449c0-bf37-40e8-9e4c-14f586d1f0b3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" Mar 14 05:35:45 crc kubenswrapper[4713]: I0314 05:35:45.824025 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b3d449c0-bf37-40e8-9e4c-14f586d1f0b3-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-f54c54754-dhljx\" (UID: \"b3d449c0-bf37-40e8-9e4c-14f586d1f0b3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" Mar 14 05:35:46 crc kubenswrapper[4713]: I0314 05:35:46.082694 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" Mar 14 05:35:46 crc kubenswrapper[4713]: I0314 05:35:46.482176 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx"] Mar 14 05:35:47 crc kubenswrapper[4713]: I0314 05:35:47.284745 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" event={"ID":"b3d449c0-bf37-40e8-9e4c-14f586d1f0b3","Type":"ContainerStarted","Data":"3eb8ab9a7d7069ed5e119331cec05fef45120b8030727e20198dc0ea2ea79717"} Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.292931 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" event={"ID":"b3d449c0-bf37-40e8-9e4c-14f586d1f0b3","Type":"ContainerStarted","Data":"d75b4b698387d0513da2b6b82c82fe51ae18108798499ede476fbf54d04cd679"} Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.293331 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.298344 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.320187 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podStartSLOduration=1.780380563 
podStartE2EDuration="3.320166532s" podCreationTimestamp="2026-03-14 05:35:45 +0000 UTC" firstStartedPulling="2026-03-14 05:35:46.496660542 +0000 UTC m=+529.584569852" lastFinishedPulling="2026-03-14 05:35:48.036446521 +0000 UTC m=+531.124355821" observedRunningTime="2026-03-14 05:35:48.316729031 +0000 UTC m=+531.404638341" watchObservedRunningTime="2026-03-14 05:35:48.320166532 +0000 UTC m=+531.408075832" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.527559 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-qvlxc"] Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.528584 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.530334 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-vz7zm" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.530574 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.530832 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.530963 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.550175 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-qvlxc"] Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.646250 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.646309 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.646353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.646659 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45gct\" (UniqueName: \"kubernetes.io/projected/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-kube-api-access-45gct\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.747896 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45gct\" (UniqueName: \"kubernetes.io/projected/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-kube-api-access-45gct\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " 
pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.747981 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.748048 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.748084 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.749912 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.755275 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.756249 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.768454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45gct\" (UniqueName: \"kubernetes.io/projected/9bec0fa6-53cc-43e9-b5b1-e23f1965067b-kube-api-access-45gct\") pod \"prometheus-operator-db54df47d-qvlxc\" (UID: \"9bec0fa6-53cc-43e9-b5b1-e23f1965067b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:48 crc kubenswrapper[4713]: I0314 05:35:48.848642 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" Mar 14 05:35:49 crc kubenswrapper[4713]: I0314 05:35:49.035228 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-qvlxc"] Mar 14 05:35:49 crc kubenswrapper[4713]: I0314 05:35:49.300486 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" event={"ID":"9bec0fa6-53cc-43e9-b5b1-e23f1965067b","Type":"ContainerStarted","Data":"c100d9b15fa74e66094985b96b91f196cbe06ff7a71cfc840f3b5c420ff846f9"} Mar 14 05:35:51 crc kubenswrapper[4713]: I0314 05:35:51.319014 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" event={"ID":"9bec0fa6-53cc-43e9-b5b1-e23f1965067b","Type":"ContainerStarted","Data":"0ccf77979c9278cb5db3868e956d7017d0ebc9715f53d58d5b90803a748c2f73"} Mar 14 05:35:51 crc kubenswrapper[4713]: I0314 05:35:51.319103 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" event={"ID":"9bec0fa6-53cc-43e9-b5b1-e23f1965067b","Type":"ContainerStarted","Data":"42634f46f3a4cecb28224c8ed9dd2b9c85c0117e3a4a9d3ae038e69ec3006d7d"} Mar 14 05:35:51 crc kubenswrapper[4713]: I0314 05:35:51.337378 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-qvlxc" podStartSLOduration=1.729258744 podStartE2EDuration="3.337345572s" podCreationTimestamp="2026-03-14 05:35:48 +0000 UTC" firstStartedPulling="2026-03-14 05:35:49.045158571 +0000 UTC m=+532.133067871" lastFinishedPulling="2026-03-14 05:35:50.653245359 +0000 UTC m=+533.741154699" observedRunningTime="2026-03-14 05:35:51.335337428 +0000 UTC m=+534.423246758" watchObservedRunningTime="2026-03-14 05:35:51.337345572 +0000 UTC m=+534.425254902" Mar 14 05:35:52 crc kubenswrapper[4713]: I0314 05:35:52.906171 4713 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"] Mar 14 05:35:52 crc kubenswrapper[4713]: I0314 05:35:52.907795 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" Mar 14 05:35:52 crc kubenswrapper[4713]: W0314 05:35:52.910626 4713 reflector.go:561] object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-j2xxn": failed to list *v1.Secret: secrets "openshift-state-metrics-dockercfg-j2xxn" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'crc' and this object Mar 14 05:35:52 crc kubenswrapper[4713]: E0314 05:35:52.910698 4713 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-j2xxn\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-state-metrics-dockercfg-j2xxn\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 05:35:52 crc kubenswrapper[4713]: W0314 05:35:52.917638 4713 reflector.go:561] object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config": failed to list *v1.Secret: secrets "openshift-state-metrics-kube-rbac-proxy-config" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'crc' and this object Mar 14 05:35:52 crc kubenswrapper[4713]: E0314 05:35:52.917700 4713 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-state-metrics-kube-rbac-proxy-config\" is forbidden: User 
\"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 05:35:52 crc kubenswrapper[4713]: W0314 05:35:52.918436 4713 reflector.go:561] object-"openshift-monitoring"/"openshift-state-metrics-tls": failed to list *v1.Secret: secrets "openshift-state-metrics-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'crc' and this object Mar 14 05:35:52 crc kubenswrapper[4713]: E0314 05:35:52.918507 4713 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-state-metrics-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 05:35:52 crc kubenswrapper[4713]: I0314 05:35:52.960030 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"] Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.009299 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.009506 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.009567    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87c0bb14-83f9-47aa-a294-9c0a47b39921-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.009678    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xs5l\" (UniqueName: \"kubernetes.io/projected/87c0bb14-83f9-47aa-a294-9c0a47b39921-kube-api-access-5xs5l\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.015076    4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"]
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.016292    4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.028116    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-9tbzw"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.028270    4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.028540    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.028686    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.039767    4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"]
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.048550    4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-48s5q"]
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.049668    4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.060625    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.061313    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-66qh7"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.069566    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111611    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f736f471-5e50-47ba-8449-b7b90f774e5e-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111681    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111710    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f736f471-5e50-47ba-8449-b7b90f774e5e-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111741    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111764    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72pl\" (UniqueName: \"kubernetes.io/projected/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-api-access-f72pl\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111788    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-metrics-client-ca\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111839    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111864    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-tls\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111887    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87c0bb14-83f9-47aa-a294-9c0a47b39921-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111911    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xs5l\" (UniqueName: \"kubernetes.io/projected/87c0bb14-83f9-47aa-a294-9c0a47b39921-kube-api-access-5xs5l\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.111947    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.112011    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.112045    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-sys\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.112067    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.112096    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-wtmp\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.112121    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-root\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.112142    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-textfile\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.112172    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9v7s\" (UniqueName: \"kubernetes.io/projected/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-kube-api-access-c9v7s\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.113507    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87c0bb14-83f9-47aa-a294-9c0a47b39921-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.153725    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xs5l\" (UniqueName: \"kubernetes.io/projected/87c0bb14-83f9-47aa-a294-9c0a47b39921-kube-api-access-5xs5l\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.214678    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.214784    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-sys\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.214837    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-wtmp\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.214866    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-root\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.214888    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-textfile\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.214914    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9v7s\" (UniqueName: \"kubernetes.io/projected/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-kube-api-access-c9v7s\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.214944    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f736f471-5e50-47ba-8449-b7b90f774e5e-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.214976    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.214999    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f736f471-5e50-47ba-8449-b7b90f774e5e-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.215025    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.215048    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72pl\" (UniqueName: \"kubernetes.io/projected/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-api-access-f72pl\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.215070    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-metrics-client-ca\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.215112    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-tls\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.215152    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.216421    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-wtmp\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.216773    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f736f471-5e50-47ba-8449-b7b90f774e5e-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.216854    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-root\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.217169    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-textfile\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.217455    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.217515    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-sys\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: E0314 05:35:53.217984    4713 secret.go:188] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Mar 14 05:35:53 crc kubenswrapper[4713]: E0314 05:35:53.218078    4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-tls podName:52de4d31-2b95-4459-9e38-7f81f8dc2d0a nodeName:}" failed. No retries permitted until 2026-03-14 05:35:53.718051703 +0000 UTC m=+536.805961003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-tls") pod "node-exporter-48s5q" (UID: "52de4d31-2b95-4459-9e38-7f81f8dc2d0a") : secret "node-exporter-tls" not found
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.218687    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f736f471-5e50-47ba-8449-b7b90f774e5e-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.221903    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.222666    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.218087    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-metrics-client-ca\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.230988    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.236701    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72pl\" (UniqueName: \"kubernetes.io/projected/f736f471-5e50-47ba-8449-b7b90f774e5e-kube-api-access-f72pl\") pod \"kube-state-metrics-777cb5bd5d-lp9bl\" (UID: \"f736f471-5e50-47ba-8449-b7b90f774e5e\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.249944    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9v7s\" (UniqueName: \"kubernetes.io/projected/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-kube-api-access-c9v7s\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.332521    4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.544913    4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl"]
Mar 14 05:35:53 crc kubenswrapper[4713]: W0314 05:35:53.555369    4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf736f471_5e50_47ba_8449_b7b90f774e5e.slice/crio-2b0078bacf7a8b1682adfadd708c3af362cc253ce92383465fead63ab6702c85 WatchSource:0}: Error finding container 2b0078bacf7a8b1682adfadd708c3af362cc253ce92383465fead63ab6702c85: Status 404 returned error can't find the container with id 2b0078bacf7a8b1682adfadd708c3af362cc253ce92383465fead63ab6702c85
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.723224    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-tls\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.727902    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52de4d31-2b95-4459-9e38-7f81f8dc2d0a-node-exporter-tls\") pod \"node-exporter-48s5q\" (UID: \"52de4d31-2b95-4459-9e38-7f81f8dc2d0a\") " pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:53 crc kubenswrapper[4713]: I0314 05:35:53.974160    4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-48s5q"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.005346    4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.007720    4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: W0314 05:35:54.008770    4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52de4d31_2b95_4459_9e38_7f81f8dc2d0a.slice/crio-5cc84b57aefbd0fa39ea3660450dd7335914b8937b435647df5b736419418f91 WatchSource:0}: Error finding container 5cc84b57aefbd0fa39ea3660450dd7335914b8937b435647df5b736419418f91: Status 404 returned error can't find the container with id 5cc84b57aefbd0fa39ea3660450dd7335914b8937b435647df5b736419418f91
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.011976    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.012155    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.012426    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-hdbxf"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.012537    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.013218    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.013432    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.013507    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.014279    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.019111    4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.032134    4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 14 05:35:54 crc kubenswrapper[4713]: E0314 05:35:54.113142    4713 secret.go:188] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 14 05:35:54 crc kubenswrapper[4713]: E0314 05:35:54.113313    4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-kube-rbac-proxy-config podName:87c0bb14-83f9-47aa-a294-9c0a47b39921 nodeName:}" failed. No retries permitted until 2026-03-14 05:35:54.613278791 +0000 UTC m=+537.701188091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-566fddb674-56x7x" (UID: "87c0bb14-83f9-47aa-a294-9c0a47b39921") : failed to sync secret cache: timed out waiting for the condition
Mar 14 05:35:54 crc kubenswrapper[4713]: E0314 05:35:54.113534    4713 secret.go:188] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 14 05:35:54 crc kubenswrapper[4713]: E0314 05:35:54.113575    4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-tls podName:87c0bb14-83f9-47aa-a294-9c0a47b39921 nodeName:}" failed. No retries permitted until 2026-03-14 05:35:54.61356775 +0000 UTC m=+537.701477050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-tls") pod "openshift-state-metrics-566fddb674-56x7x" (UID: "87c0bb14-83f9-47aa-a294-9c0a47b39921") : failed to sync secret cache: timed out waiting for the condition
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.125890    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-j2xxn"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.130748    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35c40366-5578-424e-aeef-eac9a128181f-config-out\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.130824    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35c40366-5578-424e-aeef-eac9a128181f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.130881    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.130934    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.130965    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-587nv\" (UniqueName: \"kubernetes.io/projected/35c40366-5578-424e-aeef-eac9a128181f-kube-api-access-587nv\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.130998    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35c40366-5578-424e-aeef-eac9a128181f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.131085    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-web-config\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.131115    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-config-volume\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.131140    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c40366-5578-424e-aeef-eac9a128181f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.131166    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.131185    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.131376    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/35c40366-5578-424e-aeef-eac9a128181f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.177607    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.211467    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.232826    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-587nv\" (UniqueName: \"kubernetes.io/projected/35c40366-5578-424e-aeef-eac9a128181f-kube-api-access-587nv\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.232876    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35c40366-5578-424e-aeef-eac9a128181f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.232940    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-web-config\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.232960    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-config-volume\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.232993    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c40366-5578-424e-aeef-eac9a128181f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.233014    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.233035    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.233065    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\"
(UniqueName: \"kubernetes.io/empty-dir/35c40366-5578-424e-aeef-eac9a128181f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.233087 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35c40366-5578-424e-aeef-eac9a128181f-config-out\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.233111 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35c40366-5578-424e-aeef-eac9a128181f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.233142 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.233160 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.234626 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" 
(UniqueName: \"kubernetes.io/empty-dir/35c40366-5578-424e-aeef-eac9a128181f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.234926 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35c40366-5578-424e-aeef-eac9a128181f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.235836 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c40366-5578-424e-aeef-eac9a128181f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.239824 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-web-config\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.241069 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.241314 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-config-volume\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.241636 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35c40366-5578-424e-aeef-eac9a128181f-config-out\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.242704 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.242890 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.244465 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35c40366-5578-424e-aeef-eac9a128181f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.248415 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/35c40366-5578-424e-aeef-eac9a128181f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.249382 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-587nv\" (UniqueName: \"kubernetes.io/projected/35c40366-5578-424e-aeef-eac9a128181f-kube-api-access-587nv\") pod \"alertmanager-main-0\" (UID: \"35c40366-5578-424e-aeef-eac9a128181f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.329580 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.338097 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl" event={"ID":"f736f471-5e50-47ba-8449-b7b90f774e5e","Type":"ContainerStarted","Data":"2b0078bacf7a8b1682adfadd708c3af362cc253ce92383465fead63ab6702c85"} Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.339235 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-48s5q" event={"ID":"52de4d31-2b95-4459-9e38-7f81f8dc2d0a","Type":"ContainerStarted","Data":"5cc84b57aefbd0fa39ea3660450dd7335914b8937b435647df5b736419418f91"} Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.559694 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 14 05:35:54 crc kubenswrapper[4713]: W0314 05:35:54.567960 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c40366_5578_424e_aeef_eac9a128181f.slice/crio-424351a7898bcc6b889112df7b0df1b8e849c75ea2d7289f46a8c1b30ab1fe7c WatchSource:0}: Error finding container 
424351a7898bcc6b889112df7b0df1b8e849c75ea2d7289f46a8c1b30ab1fe7c: Status 404 returned error can't find the container with id 424351a7898bcc6b889112df7b0df1b8e849c75ea2d7289f46a8c1b30ab1fe7c Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.639467 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.639586 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.645421 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.649191 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/87c0bb14-83f9-47aa-a294-9c0a47b39921-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-56x7x\" (UID: \"87c0bb14-83f9-47aa-a294-9c0a47b39921\") " 
pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.724528 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" Mar 14 05:35:54 crc kubenswrapper[4713]: I0314 05:35:54.952926 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-56x7x"] Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.001532 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-596654c596-mpwzl"] Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.003096 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.011878 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-zxlss" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.011882 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.012078 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-5hngcqacjjhqt" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.012252 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.012273 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.012346 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 14 05:35:55 crc 
kubenswrapper[4713]: I0314 05:35:55.012428 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.024075 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-596654c596-mpwzl"] Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.048946 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvg7r\" (UniqueName: \"kubernetes.io/projected/aee49a16-349d-4656-a0d0-c78cb70ca08f-kube-api-access-tvg7r\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.049019 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.049054 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.049121 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-tls\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.049326 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-grpc-tls\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.049409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.049471 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aee49a16-349d-4656-a0d0-c78cb70ca08f-metrics-client-ca\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.049499 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " 
pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.151249 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvg7r\" (UniqueName: \"kubernetes.io/projected/aee49a16-349d-4656-a0d0-c78cb70ca08f-kube-api-access-tvg7r\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.151309 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.151339 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.151370 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-tls\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.151403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" 
(UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-grpc-tls\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.151435 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.151463 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aee49a16-349d-4656-a0d0-c78cb70ca08f-metrics-client-ca\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.151487 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.158264 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " 
pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.159319 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aee49a16-349d-4656-a0d0-c78cb70ca08f-metrics-client-ca\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.163678 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-tls\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.164063 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.164145 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.166241 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.173147 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aee49a16-349d-4656-a0d0-c78cb70ca08f-secret-grpc-tls\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.176523 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvg7r\" (UniqueName: \"kubernetes.io/projected/aee49a16-349d-4656-a0d0-c78cb70ca08f-kube-api-access-tvg7r\") pod \"thanos-querier-596654c596-mpwzl\" (UID: \"aee49a16-349d-4656-a0d0-c78cb70ca08f\") " pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.324961 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.347962 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" event={"ID":"87c0bb14-83f9-47aa-a294-9c0a47b39921","Type":"ContainerStarted","Data":"92f5342c107d83793d12c2385eb4fa1a989d06af2a84bd60b6a5b50fff3356ba"} Mar 14 05:35:55 crc kubenswrapper[4713]: I0314 05:35:55.349787 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35c40366-5578-424e-aeef-eac9a128181f","Type":"ContainerStarted","Data":"424351a7898bcc6b889112df7b0df1b8e849c75ea2d7289f46a8c1b30ab1fe7c"} Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.051192 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-596654c596-mpwzl"] Mar 14 05:35:56 crc kubenswrapper[4713]: W0314 05:35:56.073459 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee49a16_349d_4656_a0d0_c78cb70ca08f.slice/crio-f43999faf63faf6544f7504d6206081b1e82f4d712ef499f09a3db8e14de985f WatchSource:0}: Error finding container f43999faf63faf6544f7504d6206081b1e82f4d712ef499f09a3db8e14de985f: Status 404 returned error can't find the container with id f43999faf63faf6544f7504d6206081b1e82f4d712ef499f09a3db8e14de985f Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.360673 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" event={"ID":"87c0bb14-83f9-47aa-a294-9c0a47b39921","Type":"ContainerStarted","Data":"163e04bfda3978f630a1a05537ee115fd659e54a4daad097fbc2014e43ab07c4"} Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.360725 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" 
event={"ID":"87c0bb14-83f9-47aa-a294-9c0a47b39921","Type":"ContainerStarted","Data":"7e45adcae2641cede96722a0d27b7043d96c51359933ecd7d29435911d876fd0"} Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.362834 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl" event={"ID":"f736f471-5e50-47ba-8449-b7b90f774e5e","Type":"ContainerStarted","Data":"ff962e66f0a059444f9d9e433a618e4a5e8e0a93c013973f2ad89b2f5c3a8a19"} Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.362859 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl" event={"ID":"f736f471-5e50-47ba-8449-b7b90f774e5e","Type":"ContainerStarted","Data":"fe7024a1bd766e699ab086faba3a7487cdfc81cc0ec2372ecefd65cba3bf96fb"} Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.362868 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl" event={"ID":"f736f471-5e50-47ba-8449-b7b90f774e5e","Type":"ContainerStarted","Data":"09ad315a43e322cc9c1b4af3a92928a2d6efcaba2eb01c23298cf8456d0644b2"} Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.365109 4713 generic.go:334] "Generic (PLEG): container finished" podID="52de4d31-2b95-4459-9e38-7f81f8dc2d0a" containerID="74c9c2c29e7e63beaf0eebc4659f6188996b295b23f8e8dd9c32e10353b9584f" exitCode=0 Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.365175 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-48s5q" event={"ID":"52de4d31-2b95-4459-9e38-7f81f8dc2d0a","Type":"ContainerDied","Data":"74c9c2c29e7e63beaf0eebc4659f6188996b295b23f8e8dd9c32e10353b9584f"} Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.366679 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" 
event={"ID":"aee49a16-349d-4656-a0d0-c78cb70ca08f","Type":"ContainerStarted","Data":"f43999faf63faf6544f7504d6206081b1e82f4d712ef499f09a3db8e14de985f"} Mar 14 05:35:56 crc kubenswrapper[4713]: I0314 05:35:56.387327 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-lp9bl" podStartSLOduration=2.33794964 podStartE2EDuration="4.387294227s" podCreationTimestamp="2026-03-14 05:35:52 +0000 UTC" firstStartedPulling="2026-03-14 05:35:53.557792145 +0000 UTC m=+536.645701445" lastFinishedPulling="2026-03-14 05:35:55.607136732 +0000 UTC m=+538.695046032" observedRunningTime="2026-03-14 05:35:56.381590414 +0000 UTC m=+539.469499734" watchObservedRunningTime="2026-03-14 05:35:56.387294227 +0000 UTC m=+539.475203537" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.375260 4713 generic.go:334] "Generic (PLEG): container finished" podID="35c40366-5578-424e-aeef-eac9a128181f" containerID="8170ff0b45bbcf7858dadb0af8662bdde49edca2de8e6792a2b466ede398d031" exitCode=0 Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.375334 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35c40366-5578-424e-aeef-eac9a128181f","Type":"ContainerDied","Data":"8170ff0b45bbcf7858dadb0af8662bdde49edca2de8e6792a2b466ede398d031"} Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.388883 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-48s5q" event={"ID":"52de4d31-2b95-4459-9e38-7f81f8dc2d0a","Type":"ContainerStarted","Data":"320bd32c76b09140259b5889f4778d154e012fb98cb110c91bb55edb308fb210"} Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.388933 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-48s5q" event={"ID":"52de4d31-2b95-4459-9e38-7f81f8dc2d0a","Type":"ContainerStarted","Data":"9de4456db46bf1ee7b789079226d4028f43df200b4abe411a18cbcb48f995be5"} Mar 
14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.434700 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-48s5q" podStartSLOduration=3.854220982 podStartE2EDuration="5.434681961s" podCreationTimestamp="2026-03-14 05:35:52 +0000 UTC" firstStartedPulling="2026-03-14 05:35:54.016873358 +0000 UTC m=+537.104782658" lastFinishedPulling="2026-03-14 05:35:55.597334337 +0000 UTC m=+538.685243637" observedRunningTime="2026-03-14 05:35:57.430149285 +0000 UTC m=+540.518058595" watchObservedRunningTime="2026-03-14 05:35:57.434681961 +0000 UTC m=+540.522591261" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.730412 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-76b6b6b6bb-wvhf8"] Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.731562 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.777424 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b6b6b6bb-wvhf8"] Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.817052 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-service-ca\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.817131 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv2hr\" (UniqueName: \"kubernetes.io/projected/69b5585b-519e-4399-b6ef-f4e8d0641705-kube-api-access-jv2hr\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc 
kubenswrapper[4713]: I0314 05:35:57.817189 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-trusted-ca-bundle\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.817249 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-oauth-config\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.817287 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-console-config\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.817311 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-serving-cert\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.817352 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-oauth-serving-cert\") pod \"console-76b6b6b6bb-wvhf8\" (UID: 
\"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.918746 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-trusted-ca-bundle\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.918818 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-oauth-config\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.918847 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-console-config\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.918865 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-serving-cert\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.918893 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-oauth-serving-cert\") pod \"console-76b6b6b6bb-wvhf8\" (UID: 
\"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.918924 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-service-ca\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.918945 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv2hr\" (UniqueName: \"kubernetes.io/projected/69b5585b-519e-4399-b6ef-f4e8d0641705-kube-api-access-jv2hr\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.920916 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-trusted-ca-bundle\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.922911 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-console-config\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.923080 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-service-ca\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " 
pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.923306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-oauth-serving-cert\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.927586 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-serving-cert\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.929486 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-oauth-config\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:57 crc kubenswrapper[4713]: I0314 05:35:57.936302 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv2hr\" (UniqueName: \"kubernetes.io/projected/69b5585b-519e-4399-b6ef-f4e8d0641705-kube-api-access-jv2hr\") pod \"console-76b6b6b6bb-wvhf8\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.051695 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.298245 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6d5f446985-q8pw2"] Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.299476 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.304898 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.305416 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.308197 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-dcvp7" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.308506 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-82jlg307lf8t3" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.308716 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.308851 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.310631 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6d5f446985-q8pw2"] Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.425297 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d7cea157-995e-400c-b2ee-85357ae7fb7b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.425555 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cea157-995e-400c-b2ee-85357ae7fb7b-client-ca-bundle\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.425672 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d7cea157-995e-400c-b2ee-85357ae7fb7b-audit-log\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.425761 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpclr\" (UniqueName: \"kubernetes.io/projected/d7cea157-995e-400c-b2ee-85357ae7fb7b-kube-api-access-lpclr\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.425812 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d7cea157-995e-400c-b2ee-85357ae7fb7b-metrics-server-audit-profiles\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 
05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.425899 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d7cea157-995e-400c-b2ee-85357ae7fb7b-secret-metrics-server-tls\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.425933 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d7cea157-995e-400c-b2ee-85357ae7fb7b-secret-metrics-client-certs\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.527774 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cea157-995e-400c-b2ee-85357ae7fb7b-client-ca-bundle\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.528309 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d7cea157-995e-400c-b2ee-85357ae7fb7b-audit-log\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.528353 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpclr\" (UniqueName: \"kubernetes.io/projected/d7cea157-995e-400c-b2ee-85357ae7fb7b-kube-api-access-lpclr\") pod \"metrics-server-6d5f446985-q8pw2\" 
(UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.528386 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d7cea157-995e-400c-b2ee-85357ae7fb7b-metrics-server-audit-profiles\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.528419 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d7cea157-995e-400c-b2ee-85357ae7fb7b-secret-metrics-server-tls\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.528437 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d7cea157-995e-400c-b2ee-85357ae7fb7b-secret-metrics-client-certs\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.528501 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7cea157-995e-400c-b2ee-85357ae7fb7b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.529048 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" 
(UniqueName: \"kubernetes.io/empty-dir/d7cea157-995e-400c-b2ee-85357ae7fb7b-audit-log\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.529999 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7cea157-995e-400c-b2ee-85357ae7fb7b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.530694 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d7cea157-995e-400c-b2ee-85357ae7fb7b-metrics-server-audit-profiles\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.534505 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d7cea157-995e-400c-b2ee-85357ae7fb7b-secret-metrics-client-certs\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.538187 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cea157-995e-400c-b2ee-85357ae7fb7b-client-ca-bundle\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.539224 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d7cea157-995e-400c-b2ee-85357ae7fb7b-secret-metrics-server-tls\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.548935 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpclr\" (UniqueName: \"kubernetes.io/projected/d7cea157-995e-400c-b2ee-85357ae7fb7b-kube-api-access-lpclr\") pod \"metrics-server-6d5f446985-q8pw2\" (UID: \"d7cea157-995e-400c-b2ee-85357ae7fb7b\") " pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.666961 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b6b6b6bb-wvhf8"] Mar 14 05:35:58 crc kubenswrapper[4713]: W0314 05:35:58.675909 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b5585b_519e_4399_b6ef_f4e8d0641705.slice/crio-0189fe34c4e4e6063568966ccd4d19abbb552303bab35142fb0a687782b10e21 WatchSource:0}: Error finding container 0189fe34c4e4e6063568966ccd4d19abbb552303bab35142fb0a687782b10e21: Status 404 returned error can't find the container with id 0189fe34c4e4e6063568966ccd4d19abbb552303bab35142fb0a687782b10e21 Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.694036 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.709869 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-84469c67d6-74jtt"] Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.710657 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.713548 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.713752 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.734770 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-84469c67d6-74jtt"] Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.842766 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/88a15bde-288a-4e1f-b537-7127832ecb65-monitoring-plugin-cert\") pod \"monitoring-plugin-84469c67d6-74jtt\" (UID: \"88a15bde-288a-4e1f-b537-7127832ecb65\") " pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.948473 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/88a15bde-288a-4e1f-b537-7127832ecb65-monitoring-plugin-cert\") pod \"monitoring-plugin-84469c67d6-74jtt\" (UID: \"88a15bde-288a-4e1f-b537-7127832ecb65\") " pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" Mar 14 05:35:58 crc kubenswrapper[4713]: I0314 05:35:58.963693 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/88a15bde-288a-4e1f-b537-7127832ecb65-monitoring-plugin-cert\") pod \"monitoring-plugin-84469c67d6-74jtt\" (UID: \"88a15bde-288a-4e1f-b537-7127832ecb65\") " pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.038641 4713 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.136699 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6d5f446985-q8pw2"] Mar 14 05:35:59 crc kubenswrapper[4713]: W0314 05:35:59.145772 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7cea157_995e_400c_b2ee_85357ae7fb7b.slice/crio-87dcaecab819cd9cf0e6715157affc9c933b9a0a005b836db8b598509ff9d62f WatchSource:0}: Error finding container 87dcaecab819cd9cf0e6715157affc9c933b9a0a005b836db8b598509ff9d62f: Status 404 returned error can't find the container with id 87dcaecab819cd9cf0e6715157affc9c933b9a0a005b836db8b598509ff9d62f Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.405935 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.409067 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" event={"ID":"87c0bb14-83f9-47aa-a294-9c0a47b39921","Type":"ContainerStarted","Data":"74908663b409adeedd95c91116fa6e397240eaa46844ee35eec4346667ffaf54"} Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.412177 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.412353 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b6b6b6bb-wvhf8" event={"ID":"69b5585b-519e-4399-b6ef-f4e8d0641705","Type":"ContainerStarted","Data":"5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e"} Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.412409 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b6b6b6bb-wvhf8" event={"ID":"69b5585b-519e-4399-b6ef-f4e8d0641705","Type":"ContainerStarted","Data":"0189fe34c4e4e6063568966ccd4d19abbb552303bab35142fb0a687782b10e21"} Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.418372 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.418691 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-5r4bcti23950e" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.418822 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.418954 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.419101 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.419244 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.419260 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s-dockercfg-z9hwb" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.419374 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.419432 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.419480 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.419549 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.421079 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" event={"ID":"d7cea157-995e-400c-b2ee-85357ae7fb7b","Type":"ContainerStarted","Data":"87dcaecab819cd9cf0e6715157affc9c933b9a0a005b836db8b598509ff9d62f"} Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.423739 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.428328 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.443552 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-56x7x" podStartSLOduration=5.199056679 podStartE2EDuration="7.443529235s" podCreationTimestamp="2026-03-14 05:35:52 +0000 UTC" firstStartedPulling="2026-03-14 05:35:56.023067136 +0000 UTC m=+539.110976436" lastFinishedPulling="2026-03-14 05:35:58.267539692 +0000 UTC m=+541.355448992" observedRunningTime="2026-03-14 
05:35:59.439643569 +0000 UTC m=+542.527552869" watchObservedRunningTime="2026-03-14 05:35:59.443529235 +0000 UTC m=+542.531438535" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.457322 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.533100 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76b6b6b6bb-wvhf8" podStartSLOduration=2.533080666 podStartE2EDuration="2.533080666s" podCreationTimestamp="2026-03-14 05:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:35:59.529024885 +0000 UTC m=+542.616934175" watchObservedRunningTime="2026-03-14 05:35:59.533080666 +0000 UTC m=+542.620989966" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.563003 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-84469c67d6-74jtt"] Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.568921 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-web-config\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569029 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569054 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-config\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569089 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569155 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569191 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569235 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-config-out\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569277 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569300 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569345 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569366 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbjvv\" (UniqueName: \"kubernetes.io/projected/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-kube-api-access-rbjvv\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569393 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: 
I0314 05:35:59.569455 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569476 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569496 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569523 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569541 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") 
" pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.569562 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672460 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672558 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672587 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-config-out\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672650 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672684 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672723 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672746 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbjvv\" (UniqueName: \"kubernetes.io/projected/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-kube-api-access-rbjvv\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672768 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672820 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672839 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672860 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672886 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672908 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672941 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672971 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-web-config\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.672999 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-config\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.673019 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.673044 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.674392 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.674798 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.675470 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.680600 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.681159 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.682444 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-config\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 
05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.682992 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.683488 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.684185 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.686239 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-web-config\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.686036 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.687716 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.688303 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-config-out\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.690339 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.690622 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.693428 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.697983 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbjvv\" (UniqueName: 
\"kubernetes.io/projected/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-kube-api-access-rbjvv\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.698286 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/249e8a7d-5c1c-4d41-a243-6ab0ad96094c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"249e8a7d-5c1c-4d41-a243-6ab0ad96094c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:35:59 crc kubenswrapper[4713]: I0314 05:35:59.737462 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.135697 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557776-fd952"] Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.137161 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557776-fd952" Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.143535 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.145323 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.145865 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.150621 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557776-fd952"] Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.282458 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h8w9\" (UniqueName: \"kubernetes.io/projected/a5e5d46f-370a-410c-a86e-e0ae032a3b35-kube-api-access-4h8w9\") pod \"auto-csr-approver-29557776-fd952\" (UID: \"a5e5d46f-370a-410c-a86e-e0ae032a3b35\") " pod="openshift-infra/auto-csr-approver-29557776-fd952" Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.386042 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h8w9\" (UniqueName: \"kubernetes.io/projected/a5e5d46f-370a-410c-a86e-e0ae032a3b35-kube-api-access-4h8w9\") pod \"auto-csr-approver-29557776-fd952\" (UID: \"a5e5d46f-370a-410c-a86e-e0ae032a3b35\") " pod="openshift-infra/auto-csr-approver-29557776-fd952" Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.419783 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h8w9\" (UniqueName: \"kubernetes.io/projected/a5e5d46f-370a-410c-a86e-e0ae032a3b35-kube-api-access-4h8w9\") pod \"auto-csr-approver-29557776-fd952\" (UID: \"a5e5d46f-370a-410c-a86e-e0ae032a3b35\") " 
pod="openshift-infra/auto-csr-approver-29557776-fd952" Mar 14 05:36:00 crc kubenswrapper[4713]: I0314 05:36:00.464699 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557776-fd952" Mar 14 05:36:01 crc kubenswrapper[4713]: I0314 05:36:01.336909 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557776-fd952"] Mar 14 05:36:01 crc kubenswrapper[4713]: I0314 05:36:01.411345 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 14 05:36:01 crc kubenswrapper[4713]: I0314 05:36:01.434795 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" event={"ID":"88a15bde-288a-4e1f-b537-7127832ecb65","Type":"ContainerStarted","Data":"d6071213ccb29986a03770e5d08338069218fa7cd0bcf92470baec19447f5292"} Mar 14 05:36:01 crc kubenswrapper[4713]: W0314 05:36:01.758031 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod249e8a7d_5c1c_4d41_a243_6ab0ad96094c.slice/crio-67d01df34ba36acca3c9efe14cf3ab77b117281dad40d0f0505ddc7b028ab7ce WatchSource:0}: Error finding container 67d01df34ba36acca3c9efe14cf3ab77b117281dad40d0f0505ddc7b028ab7ce: Status 404 returned error can't find the container with id 67d01df34ba36acca3c9efe14cf3ab77b117281dad40d0f0505ddc7b028ab7ce Mar 14 05:36:01 crc kubenswrapper[4713]: W0314 05:36:01.761639 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5e5d46f_370a_410c_a86e_e0ae032a3b35.slice/crio-262d11669e6956ab86098b9fcc102020c30e9abf8d9d39a7caa4adaa6f8f4e61 WatchSource:0}: Error finding container 262d11669e6956ab86098b9fcc102020c30e9abf8d9d39a7caa4adaa6f8f4e61: Status 404 returned error can't find the container with id 262d11669e6956ab86098b9fcc102020c30e9abf8d9d39a7caa4adaa6f8f4e61 
Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.447982 4713 generic.go:334] "Generic (PLEG): container finished" podID="249e8a7d-5c1c-4d41-a243-6ab0ad96094c" containerID="8b49692347b0f693b851ce7f58675ee0bc7d22e029a996241a6c28a1976b5610" exitCode=0 Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.448072 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"249e8a7d-5c1c-4d41-a243-6ab0ad96094c","Type":"ContainerDied","Data":"8b49692347b0f693b851ce7f58675ee0bc7d22e029a996241a6c28a1976b5610"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.448621 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"249e8a7d-5c1c-4d41-a243-6ab0ad96094c","Type":"ContainerStarted","Data":"67d01df34ba36acca3c9efe14cf3ab77b117281dad40d0f0505ddc7b028ab7ce"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.452663 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557776-fd952" event={"ID":"a5e5d46f-370a-410c-a86e-e0ae032a3b35","Type":"ContainerStarted","Data":"262d11669e6956ab86098b9fcc102020c30e9abf8d9d39a7caa4adaa6f8f4e61"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.456310 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" event={"ID":"aee49a16-349d-4656-a0d0-c78cb70ca08f","Type":"ContainerStarted","Data":"b67a944fca781bbc0d4ec99627fc72fff38192b53b39f0a221e276ef7f9bf858"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.456344 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" event={"ID":"aee49a16-349d-4656-a0d0-c78cb70ca08f","Type":"ContainerStarted","Data":"09b3649dac68feca0ff023ebf206bbc28d1e92e0b0e9ae38070de334f8934ca1"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.456360 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" event={"ID":"aee49a16-349d-4656-a0d0-c78cb70ca08f","Type":"ContainerStarted","Data":"eaa500c09bdd378d3af0a1d0f30a73b9a4478438889d817d822a21afa7371c64"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.463266 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35c40366-5578-424e-aeef-eac9a128181f","Type":"ContainerStarted","Data":"3de1fcfb2698f63d72b2ffb7a27c1506f4a11204631dd188c40774644d9ca5dc"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.463342 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35c40366-5578-424e-aeef-eac9a128181f","Type":"ContainerStarted","Data":"aa41298d8e80fdf9a4cfdb29599f3ce97cf7cd9aa426dc7ad56b6cc3e447e3b1"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.463356 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35c40366-5578-424e-aeef-eac9a128181f","Type":"ContainerStarted","Data":"f4d6f630f0dd345ee98fe293bb2de12d57234547eb658a6acc86b845a2b34e51"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.465962 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" event={"ID":"d7cea157-995e-400c-b2ee-85357ae7fb7b","Type":"ContainerStarted","Data":"6a51f0f57d059e36da820b4d22f05bd917ee3f70f54b4ba1f35b3ebf56efeb80"} Mar 14 05:36:02 crc kubenswrapper[4713]: I0314 05:36:02.517308 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" podStartSLOduration=1.8585341290000001 podStartE2EDuration="4.517276996s" podCreationTimestamp="2026-03-14 05:35:58 +0000 UTC" firstStartedPulling="2026-03-14 05:35:59.151331012 +0000 UTC m=+542.239240312" lastFinishedPulling="2026-03-14 05:36:01.810073879 +0000 UTC m=+544.897983179" 
observedRunningTime="2026-03-14 05:36:02.510553589 +0000 UTC m=+545.598462889" watchObservedRunningTime="2026-03-14 05:36:02.517276996 +0000 UTC m=+545.605186296" Mar 14 05:36:03 crc kubenswrapper[4713]: I0314 05:36:03.475755 4713 generic.go:334] "Generic (PLEG): container finished" podID="a5e5d46f-370a-410c-a86e-e0ae032a3b35" containerID="b4cca9795aafe53ce4a49800ec57f6107a322b3a1b2af1bdab1be40845a65cc2" exitCode=0 Mar 14 05:36:03 crc kubenswrapper[4713]: I0314 05:36:03.475860 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557776-fd952" event={"ID":"a5e5d46f-370a-410c-a86e-e0ae032a3b35","Type":"ContainerDied","Data":"b4cca9795aafe53ce4a49800ec57f6107a322b3a1b2af1bdab1be40845a65cc2"} Mar 14 05:36:03 crc kubenswrapper[4713]: I0314 05:36:03.481759 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35c40366-5578-424e-aeef-eac9a128181f","Type":"ContainerStarted","Data":"576eef9aec30a30a08a0d285458b089de2f8a792e5f2d14649231e526e9fc545"} Mar 14 05:36:03 crc kubenswrapper[4713]: I0314 05:36:03.481830 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35c40366-5578-424e-aeef-eac9a128181f","Type":"ContainerStarted","Data":"7275900fe4372d6dd91928207387cbb1130ffeaaa8d2970fa13fa8d075a1f44b"} Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.492747 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" event={"ID":"aee49a16-349d-4656-a0d0-c78cb70ca08f","Type":"ContainerStarted","Data":"aa1c078870a6e28b55058be49f505832b14c9429cfef43ad4ac6b8773942e6ea"} Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.492856 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" 
event={"ID":"aee49a16-349d-4656-a0d0-c78cb70ca08f","Type":"ContainerStarted","Data":"e4e3dd5ca55f3b749f3487672bb7f2794f1dded9f5b2f1776a808b7fcf289586"} Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.492875 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" event={"ID":"aee49a16-349d-4656-a0d0-c78cb70ca08f","Type":"ContainerStarted","Data":"8ecbf560e22a5a95628c61f9b835b9ed406becb0e0364a14830e09b2a525cd77"} Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.493926 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.500512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35c40366-5578-424e-aeef-eac9a128181f","Type":"ContainerStarted","Data":"fb1c78a69a6503506adf7a473d114ba570879010ad3e5c2bfbd776980c5e9205"} Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.502321 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" event={"ID":"88a15bde-288a-4e1f-b537-7127832ecb65","Type":"ContainerStarted","Data":"60ec2cf3e4bda861d213da283a19011ce715fe2ef3a462491b9f2625cb7b54f0"} Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.502537 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.507949 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.519796 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" podStartSLOduration=2.826747318 podStartE2EDuration="10.519770735s" 
podCreationTimestamp="2026-03-14 05:35:54 +0000 UTC" firstStartedPulling="2026-03-14 05:35:56.075955718 +0000 UTC m=+539.163865018" lastFinishedPulling="2026-03-14 05:36:03.768979135 +0000 UTC m=+546.856888435" observedRunningTime="2026-03-14 05:36:04.517776011 +0000 UTC m=+547.605685321" watchObservedRunningTime="2026-03-14 05:36:04.519770735 +0000 UTC m=+547.607680035" Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.547891 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.358799791 podStartE2EDuration="11.54787035s" podCreationTimestamp="2026-03-14 05:35:53 +0000 UTC" firstStartedPulling="2026-03-14 05:35:54.571542577 +0000 UTC m=+537.659451877" lastFinishedPulling="2026-03-14 05:36:03.760613136 +0000 UTC m=+546.848522436" observedRunningTime="2026-03-14 05:36:04.546187765 +0000 UTC m=+547.634097075" watchObservedRunningTime="2026-03-14 05:36:04.54787035 +0000 UTC m=+547.635779650" Mar 14 05:36:04 crc kubenswrapper[4713]: I0314 05:36:04.567540 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" podStartSLOduration=3.913858628 podStartE2EDuration="6.567497421s" podCreationTimestamp="2026-03-14 05:35:58 +0000 UTC" firstStartedPulling="2026-03-14 05:36:01.10750345 +0000 UTC m=+544.195412750" lastFinishedPulling="2026-03-14 05:36:03.761142243 +0000 UTC m=+546.849051543" observedRunningTime="2026-03-14 05:36:04.566953184 +0000 UTC m=+547.654862494" watchObservedRunningTime="2026-03-14 05:36:04.567497421 +0000 UTC m=+547.655406721" Mar 14 05:36:05 crc kubenswrapper[4713]: I0314 05:36:05.849248 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557776-fd952" Mar 14 05:36:05 crc kubenswrapper[4713]: I0314 05:36:05.987140 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h8w9\" (UniqueName: \"kubernetes.io/projected/a5e5d46f-370a-410c-a86e-e0ae032a3b35-kube-api-access-4h8w9\") pod \"a5e5d46f-370a-410c-a86e-e0ae032a3b35\" (UID: \"a5e5d46f-370a-410c-a86e-e0ae032a3b35\") " Mar 14 05:36:05 crc kubenswrapper[4713]: I0314 05:36:05.992991 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e5d46f-370a-410c-a86e-e0ae032a3b35-kube-api-access-4h8w9" (OuterVolumeSpecName: "kube-api-access-4h8w9") pod "a5e5d46f-370a-410c-a86e-e0ae032a3b35" (UID: "a5e5d46f-370a-410c-a86e-e0ae032a3b35"). InnerVolumeSpecName "kube-api-access-4h8w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:36:06 crc kubenswrapper[4713]: I0314 05:36:06.092512 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h8w9\" (UniqueName: \"kubernetes.io/projected/a5e5d46f-370a-410c-a86e-e0ae032a3b35-kube-api-access-4h8w9\") on node \"crc\" DevicePath \"\"" Mar 14 05:36:06 crc kubenswrapper[4713]: I0314 05:36:06.518714 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557776-fd952" event={"ID":"a5e5d46f-370a-410c-a86e-e0ae032a3b35","Type":"ContainerDied","Data":"262d11669e6956ab86098b9fcc102020c30e9abf8d9d39a7caa4adaa6f8f4e61"} Mar 14 05:36:06 crc kubenswrapper[4713]: I0314 05:36:06.518750 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557776-fd952" Mar 14 05:36:06 crc kubenswrapper[4713]: I0314 05:36:06.518785 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="262d11669e6956ab86098b9fcc102020c30e9abf8d9d39a7caa4adaa6f8f4e61" Mar 14 05:36:06 crc kubenswrapper[4713]: I0314 05:36:06.521251 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"249e8a7d-5c1c-4d41-a243-6ab0ad96094c","Type":"ContainerStarted","Data":"a6543158b26d5dbd238d8b8b4d83e6d4bfba990a2d88acc1131be0478ac5b8c1"} Mar 14 05:36:06 crc kubenswrapper[4713]: I0314 05:36:06.521295 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"249e8a7d-5c1c-4d41-a243-6ab0ad96094c","Type":"ContainerStarted","Data":"bc1ada23caae136cc6e0b83242fbc3c379363b991201badedf6e5f1ecbe41bcc"} Mar 14 05:36:06 crc kubenswrapper[4713]: I0314 05:36:06.521312 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"249e8a7d-5c1c-4d41-a243-6ab0ad96094c","Type":"ContainerStarted","Data":"1f26df562dc17a7544f54e98f6f1896acdc8c2885efe83c898f23b615f1c17fa"} Mar 14 05:36:06 crc kubenswrapper[4713]: I0314 05:36:06.926388 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557770-dmq2m"] Mar 14 05:36:06 crc kubenswrapper[4713]: I0314 05:36:06.930728 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557770-dmq2m"] Mar 14 05:36:07 crc kubenswrapper[4713]: I0314 05:36:07.540887 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"249e8a7d-5c1c-4d41-a243-6ab0ad96094c","Type":"ContainerStarted","Data":"eb03a67f892e6d6b3e23a2723607752deb05dc53f345303b1e5eac688b5655f4"} Mar 14 05:36:07 crc kubenswrapper[4713]: I0314 05:36:07.540957 4713 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"249e8a7d-5c1c-4d41-a243-6ab0ad96094c","Type":"ContainerStarted","Data":"8af41ea1f8e59bf62f9ccf840c1bb6a8173351de1cc56bc10e02d9379600c383"} Mar 14 05:36:07 crc kubenswrapper[4713]: I0314 05:36:07.540974 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"249e8a7d-5c1c-4d41-a243-6ab0ad96094c","Type":"ContainerStarted","Data":"aa02efa34222bba535d250a6bbf94db15b0c2249858d87d4360a88d650da9569"} Mar 14 05:36:07 crc kubenswrapper[4713]: I0314 05:36:07.582023 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a03b8e-3b89-41c5-9399-a6ae0d44a53c" path="/var/lib/kubelet/pods/62a03b8e-3b89-41c5-9399-a6ae0d44a53c/volumes" Mar 14 05:36:07 crc kubenswrapper[4713]: I0314 05:36:07.587541 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.191047408 podStartE2EDuration="8.587503773s" podCreationTimestamp="2026-03-14 05:35:59 +0000 UTC" firstStartedPulling="2026-03-14 05:36:02.451094667 +0000 UTC m=+545.539003977" lastFinishedPulling="2026-03-14 05:36:05.847551042 +0000 UTC m=+548.935460342" observedRunningTime="2026-03-14 05:36:07.583788893 +0000 UTC m=+550.671698263" watchObservedRunningTime="2026-03-14 05:36:07.587503773 +0000 UTC m=+550.675413113" Mar 14 05:36:08 crc kubenswrapper[4713]: I0314 05:36:08.052695 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:36:08 crc kubenswrapper[4713]: I0314 05:36:08.052810 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:36:08 crc kubenswrapper[4713]: I0314 05:36:08.057125 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:36:08 crc kubenswrapper[4713]: 
I0314 05:36:08.553107 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:36:08 crc kubenswrapper[4713]: I0314 05:36:08.632635 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rp4kf"] Mar 14 05:36:09 crc kubenswrapper[4713]: I0314 05:36:09.739098 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:36:10 crc kubenswrapper[4713]: I0314 05:36:10.337802 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" Mar 14 05:36:18 crc kubenswrapper[4713]: I0314 05:36:18.695100 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:36:18 crc kubenswrapper[4713]: I0314 05:36:18.695536 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:36:33 crc kubenswrapper[4713]: I0314 05:36:33.679703 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rp4kf" podUID="96bf650f-2c46-40aa-b26b-5d8a6df529fd" containerName="console" containerID="cri-o://5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa" gracePeriod=15 Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.102448 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rp4kf_96bf650f-2c46-40aa-b26b-5d8a6df529fd/console/0.log" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.102847 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.247640 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-serving-cert\") pod \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.247780 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-oauth-serving-cert\") pod \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.248434 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-config\") pod \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.248478 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gldf\" (UniqueName: \"kubernetes.io/projected/96bf650f-2c46-40aa-b26b-5d8a6df529fd-kube-api-access-2gldf\") pod \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.248508 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-service-ca\") pod \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.248528 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-oauth-config\") pod \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.248588 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-trusted-ca-bundle\") pod \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\" (UID: \"96bf650f-2c46-40aa-b26b-5d8a6df529fd\") " Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.248690 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "96bf650f-2c46-40aa-b26b-5d8a6df529fd" (UID: "96bf650f-2c46-40aa-b26b-5d8a6df529fd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.248833 4713 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.249330 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-config" (OuterVolumeSpecName: "console-config") pod "96bf650f-2c46-40aa-b26b-5d8a6df529fd" (UID: "96bf650f-2c46-40aa-b26b-5d8a6df529fd"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.249792 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-service-ca" (OuterVolumeSpecName: "service-ca") pod "96bf650f-2c46-40aa-b26b-5d8a6df529fd" (UID: "96bf650f-2c46-40aa-b26b-5d8a6df529fd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.249981 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "96bf650f-2c46-40aa-b26b-5d8a6df529fd" (UID: "96bf650f-2c46-40aa-b26b-5d8a6df529fd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.253607 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "96bf650f-2c46-40aa-b26b-5d8a6df529fd" (UID: "96bf650f-2c46-40aa-b26b-5d8a6df529fd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.253638 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96bf650f-2c46-40aa-b26b-5d8a6df529fd-kube-api-access-2gldf" (OuterVolumeSpecName: "kube-api-access-2gldf") pod "96bf650f-2c46-40aa-b26b-5d8a6df529fd" (UID: "96bf650f-2c46-40aa-b26b-5d8a6df529fd"). InnerVolumeSpecName "kube-api-access-2gldf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.254155 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "96bf650f-2c46-40aa-b26b-5d8a6df529fd" (UID: "96bf650f-2c46-40aa-b26b-5d8a6df529fd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.349901 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.349944 4713 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.349957 4713 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.349968 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gldf\" (UniqueName: \"kubernetes.io/projected/96bf650f-2c46-40aa-b26b-5d8a6df529fd-kube-api-access-2gldf\") on node \"crc\" DevicePath \"\"" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.349983 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96bf650f-2c46-40aa-b26b-5d8a6df529fd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.349994 4713 reconciler_common.go:293] "Volume detached for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96bf650f-2c46-40aa-b26b-5d8a6df529fd-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.732314 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rp4kf_96bf650f-2c46-40aa-b26b-5d8a6df529fd/console/0.log" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.732366 4713 generic.go:334] "Generic (PLEG): container finished" podID="96bf650f-2c46-40aa-b26b-5d8a6df529fd" containerID="5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa" exitCode=2 Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.732413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rp4kf" event={"ID":"96bf650f-2c46-40aa-b26b-5d8a6df529fd","Type":"ContainerDied","Data":"5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa"} Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.732447 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rp4kf" event={"ID":"96bf650f-2c46-40aa-b26b-5d8a6df529fd","Type":"ContainerDied","Data":"1fd1cac48e0bebe17970877303621cd970416d7ca396606fed13d9bc2e39ad41"} Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.732486 4713 scope.go:117] "RemoveContainer" containerID="5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.732508 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rp4kf" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.750817 4713 scope.go:117] "RemoveContainer" containerID="5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa" Mar 14 05:36:34 crc kubenswrapper[4713]: E0314 05:36:34.751340 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa\": container with ID starting with 5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa not found: ID does not exist" containerID="5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.751391 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa"} err="failed to get container status \"5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa\": rpc error: code = NotFound desc = could not find container \"5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa\": container with ID starting with 5bf8551f559185e4ded4762302e116ede5995616c961e6458fcf9f7f801890aa not found: ID does not exist" Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.779069 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rp4kf"] Mar 14 05:36:34 crc kubenswrapper[4713]: I0314 05:36:34.784684 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rp4kf"] Mar 14 05:36:35 crc kubenswrapper[4713]: I0314 05:36:35.574391 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96bf650f-2c46-40aa-b26b-5d8a6df529fd" path="/var/lib/kubelet/pods/96bf650f-2c46-40aa-b26b-5d8a6df529fd/volumes" Mar 14 05:36:38 crc kubenswrapper[4713]: I0314 05:36:38.703854 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:36:38 crc kubenswrapper[4713]: I0314 05:36:38.707761 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 05:36:59 crc kubenswrapper[4713]: I0314 05:36:59.738660 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:36:59 crc kubenswrapper[4713]: I0314 05:36:59.774343 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:37:00 crc kubenswrapper[4713]: I0314 05:37:00.022192 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 14 05:37:40 crc kubenswrapper[4713]: I0314 05:37:40.731250 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:37:40 crc kubenswrapper[4713]: I0314 05:37:40.732499 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.732998 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-764cb777d4-qh4kb"] Mar 14 05:37:50 crc kubenswrapper[4713]: E0314 05:37:50.733934 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e5d46f-370a-410c-a86e-e0ae032a3b35" containerName="oc" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.733952 
4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e5d46f-370a-410c-a86e-e0ae032a3b35" containerName="oc" Mar 14 05:37:50 crc kubenswrapper[4713]: E0314 05:37:50.733968 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bf650f-2c46-40aa-b26b-5d8a6df529fd" containerName="console" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.733977 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bf650f-2c46-40aa-b26b-5d8a6df529fd" containerName="console" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.734108 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="96bf650f-2c46-40aa-b26b-5d8a6df529fd" containerName="console" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.734132 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e5d46f-370a-410c-a86e-e0ae032a3b35" containerName="oc" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.734704 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.747715 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764cb777d4-qh4kb"] Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.765899 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhkvf\" (UniqueName: \"kubernetes.io/projected/30662ccb-2b29-409a-8ccc-6e68c5d7435a-kube-api-access-mhkvf\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.765958 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-serving-cert\") pod \"console-764cb777d4-qh4kb\" (UID: 
\"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.766016 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-oauth-serving-cert\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.766073 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-service-ca\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.766101 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-config\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.766155 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-trusted-ca-bundle\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.766194 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-oauth-config\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.866753 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-trusted-ca-bundle\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.866810 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-oauth-config\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.866846 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhkvf\" (UniqueName: \"kubernetes.io/projected/30662ccb-2b29-409a-8ccc-6e68c5d7435a-kube-api-access-mhkvf\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.866865 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-serving-cert\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.866905 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-oauth-serving-cert\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.866977 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-service-ca\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.867113 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-config\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.868228 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-oauth-serving-cert\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.868252 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-service-ca\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.868769 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-config\") 
pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.868873 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-trusted-ca-bundle\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.872277 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-oauth-config\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.881085 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-serving-cert\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:50 crc kubenswrapper[4713]: I0314 05:37:50.899821 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhkvf\" (UniqueName: \"kubernetes.io/projected/30662ccb-2b29-409a-8ccc-6e68c5d7435a-kube-api-access-mhkvf\") pod \"console-764cb777d4-qh4kb\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:51 crc kubenswrapper[4713]: I0314 05:37:51.052254 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:37:51 crc kubenswrapper[4713]: I0314 05:37:51.445353 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764cb777d4-qh4kb"] Mar 14 05:37:52 crc kubenswrapper[4713]: I0314 05:37:52.358403 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764cb777d4-qh4kb" event={"ID":"30662ccb-2b29-409a-8ccc-6e68c5d7435a","Type":"ContainerStarted","Data":"b7c3030a8d99f0fa0ce902c3ef3ab3b397456026b6cd10a0e7cfc1344cade9ce"} Mar 14 05:37:52 crc kubenswrapper[4713]: I0314 05:37:52.358749 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764cb777d4-qh4kb" event={"ID":"30662ccb-2b29-409a-8ccc-6e68c5d7435a","Type":"ContainerStarted","Data":"8de2213c744755c6a63c002e363799ad0674ebf98a2fc2084b943c826b04e24c"} Mar 14 05:37:52 crc kubenswrapper[4713]: I0314 05:37:52.378698 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-764cb777d4-qh4kb" podStartSLOduration=2.378662855 podStartE2EDuration="2.378662855s" podCreationTimestamp="2026-03-14 05:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:37:52.378570232 +0000 UTC m=+655.466479542" watchObservedRunningTime="2026-03-14 05:37:52.378662855 +0000 UTC m=+655.466572155" Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.139470 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557778-9hq7z"] Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.140981 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557778-9hq7z" Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.143788 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.144422 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.144439 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.161864 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557778-9hq7z"] Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.306837 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hjj9\" (UniqueName: \"kubernetes.io/projected/2e734759-d71c-4eec-9f82-fe7c4be7c9a6-kube-api-access-7hjj9\") pod \"auto-csr-approver-29557778-9hq7z\" (UID: \"2e734759-d71c-4eec-9f82-fe7c4be7c9a6\") " pod="openshift-infra/auto-csr-approver-29557778-9hq7z" Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.408921 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hjj9\" (UniqueName: \"kubernetes.io/projected/2e734759-d71c-4eec-9f82-fe7c4be7c9a6-kube-api-access-7hjj9\") pod \"auto-csr-approver-29557778-9hq7z\" (UID: \"2e734759-d71c-4eec-9f82-fe7c4be7c9a6\") " pod="openshift-infra/auto-csr-approver-29557778-9hq7z" Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.439605 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hjj9\" (UniqueName: \"kubernetes.io/projected/2e734759-d71c-4eec-9f82-fe7c4be7c9a6-kube-api-access-7hjj9\") pod \"auto-csr-approver-29557778-9hq7z\" (UID: \"2e734759-d71c-4eec-9f82-fe7c4be7c9a6\") " 
pod="openshift-infra/auto-csr-approver-29557778-9hq7z" Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.460632 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557778-9hq7z" Mar 14 05:38:00 crc kubenswrapper[4713]: I0314 05:38:00.689335 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557778-9hq7z"] Mar 14 05:38:01 crc kubenswrapper[4713]: I0314 05:38:01.052970 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:38:01 crc kubenswrapper[4713]: I0314 05:38:01.053016 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:38:01 crc kubenswrapper[4713]: I0314 05:38:01.059444 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:38:01 crc kubenswrapper[4713]: I0314 05:38:01.423302 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557778-9hq7z" event={"ID":"2e734759-d71c-4eec-9f82-fe7c4be7c9a6","Type":"ContainerStarted","Data":"5e0c8a916f4083576da359121e287dcba9639b1cfc28f32b68810e1934d502af"} Mar 14 05:38:01 crc kubenswrapper[4713]: I0314 05:38:01.428232 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:38:01 crc kubenswrapper[4713]: I0314 05:38:01.482812 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b6b6b6bb-wvhf8"] Mar 14 05:38:02 crc kubenswrapper[4713]: I0314 05:38:02.430219 4713 generic.go:334] "Generic (PLEG): container finished" podID="2e734759-d71c-4eec-9f82-fe7c4be7c9a6" containerID="73fd7985b8e9adbc9a5e6db119574fb4934888709ae2a92746dbe020455caffb" exitCode=0 Mar 14 05:38:02 crc kubenswrapper[4713]: I0314 05:38:02.430325 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557778-9hq7z" event={"ID":"2e734759-d71c-4eec-9f82-fe7c4be7c9a6","Type":"ContainerDied","Data":"73fd7985b8e9adbc9a5e6db119574fb4934888709ae2a92746dbe020455caffb"} Mar 14 05:38:03 crc kubenswrapper[4713]: I0314 05:38:03.639272 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557778-9hq7z" Mar 14 05:38:03 crc kubenswrapper[4713]: I0314 05:38:03.759833 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hjj9\" (UniqueName: \"kubernetes.io/projected/2e734759-d71c-4eec-9f82-fe7c4be7c9a6-kube-api-access-7hjj9\") pod \"2e734759-d71c-4eec-9f82-fe7c4be7c9a6\" (UID: \"2e734759-d71c-4eec-9f82-fe7c4be7c9a6\") " Mar 14 05:38:03 crc kubenswrapper[4713]: I0314 05:38:03.766189 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e734759-d71c-4eec-9f82-fe7c4be7c9a6-kube-api-access-7hjj9" (OuterVolumeSpecName: "kube-api-access-7hjj9") pod "2e734759-d71c-4eec-9f82-fe7c4be7c9a6" (UID: "2e734759-d71c-4eec-9f82-fe7c4be7c9a6"). InnerVolumeSpecName "kube-api-access-7hjj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:38:03 crc kubenswrapper[4713]: I0314 05:38:03.861416 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hjj9\" (UniqueName: \"kubernetes.io/projected/2e734759-d71c-4eec-9f82-fe7c4be7c9a6-kube-api-access-7hjj9\") on node \"crc\" DevicePath \"\"" Mar 14 05:38:04 crc kubenswrapper[4713]: I0314 05:38:04.442489 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557778-9hq7z" event={"ID":"2e734759-d71c-4eec-9f82-fe7c4be7c9a6","Type":"ContainerDied","Data":"5e0c8a916f4083576da359121e287dcba9639b1cfc28f32b68810e1934d502af"} Mar 14 05:38:04 crc kubenswrapper[4713]: I0314 05:38:04.442803 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e0c8a916f4083576da359121e287dcba9639b1cfc28f32b68810e1934d502af" Mar 14 05:38:04 crc kubenswrapper[4713]: I0314 05:38:04.442560 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557778-9hq7z" Mar 14 05:38:04 crc kubenswrapper[4713]: I0314 05:38:04.696121 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557772-flfn4"] Mar 14 05:38:04 crc kubenswrapper[4713]: I0314 05:38:04.701283 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557772-flfn4"] Mar 14 05:38:05 crc kubenswrapper[4713]: I0314 05:38:05.570434 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d53a961b-784c-4859-a121-59718c4c44a7" path="/var/lib/kubelet/pods/d53a961b-784c-4859-a121-59718c4c44a7/volumes" Mar 14 05:38:10 crc kubenswrapper[4713]: I0314 05:38:10.191282 4713 scope.go:117] "RemoveContainer" containerID="346e347bb9286f36a62d6ab11154e79a291c26ad5e8da9f5ef749a0cca2f6397" Mar 14 05:38:10 crc kubenswrapper[4713]: I0314 05:38:10.248228 4713 scope.go:117] "RemoveContainer" 
containerID="1432c561d4c9474ed2598f4d46956f32e1aa324b223448f9db9515c236a70a3d" Mar 14 05:38:10 crc kubenswrapper[4713]: I0314 05:38:10.731694 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:38:10 crc kubenswrapper[4713]: I0314 05:38:10.732017 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:38:26 crc kubenswrapper[4713]: I0314 05:38:26.527224 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-76b6b6b6bb-wvhf8" podUID="69b5585b-519e-4399-b6ef-f4e8d0641705" containerName="console" containerID="cri-o://5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e" gracePeriod=15 Mar 14 05:38:26 crc kubenswrapper[4713]: I0314 05:38:26.875913 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b6b6b6bb-wvhf8_69b5585b-519e-4399-b6ef-f4e8d0641705/console/0.log" Mar 14 05:38:26 crc kubenswrapper[4713]: I0314 05:38:26.876383 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.077625 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-serving-cert\") pod \"69b5585b-519e-4399-b6ef-f4e8d0641705\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.077681 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-trusted-ca-bundle\") pod \"69b5585b-519e-4399-b6ef-f4e8d0641705\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.077708 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-service-ca\") pod \"69b5585b-519e-4399-b6ef-f4e8d0641705\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.077727 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-oauth-serving-cert\") pod \"69b5585b-519e-4399-b6ef-f4e8d0641705\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.078404 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-console-config\") pod \"69b5585b-519e-4399-b6ef-f4e8d0641705\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.078612 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-oauth-config\") pod \"69b5585b-519e-4399-b6ef-f4e8d0641705\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.078672 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv2hr\" (UniqueName: \"kubernetes.io/projected/69b5585b-519e-4399-b6ef-f4e8d0641705-kube-api-access-jv2hr\") pod \"69b5585b-519e-4399-b6ef-f4e8d0641705\" (UID: \"69b5585b-519e-4399-b6ef-f4e8d0641705\") " Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.078737 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "69b5585b-519e-4399-b6ef-f4e8d0641705" (UID: "69b5585b-519e-4399-b6ef-f4e8d0641705"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.078755 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-service-ca" (OuterVolumeSpecName: "service-ca") pod "69b5585b-519e-4399-b6ef-f4e8d0641705" (UID: "69b5585b-519e-4399-b6ef-f4e8d0641705"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.078785 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-console-config" (OuterVolumeSpecName: "console-config") pod "69b5585b-519e-4399-b6ef-f4e8d0641705" (UID: "69b5585b-519e-4399-b6ef-f4e8d0641705"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.078839 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "69b5585b-519e-4399-b6ef-f4e8d0641705" (UID: "69b5585b-519e-4399-b6ef-f4e8d0641705"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.079195 4713 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.079253 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.079271 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.079285 4713 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b5585b-519e-4399-b6ef-f4e8d0641705-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.083683 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "69b5585b-519e-4399-b6ef-f4e8d0641705" (UID: "69b5585b-519e-4399-b6ef-f4e8d0641705"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.083726 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69b5585b-519e-4399-b6ef-f4e8d0641705-kube-api-access-jv2hr" (OuterVolumeSpecName: "kube-api-access-jv2hr") pod "69b5585b-519e-4399-b6ef-f4e8d0641705" (UID: "69b5585b-519e-4399-b6ef-f4e8d0641705"). InnerVolumeSpecName "kube-api-access-jv2hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.086358 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "69b5585b-519e-4399-b6ef-f4e8d0641705" (UID: "69b5585b-519e-4399-b6ef-f4e8d0641705"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.181130 4713 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.181178 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv2hr\" (UniqueName: \"kubernetes.io/projected/69b5585b-519e-4399-b6ef-f4e8d0641705-kube-api-access-jv2hr\") on node \"crc\" DevicePath \"\"" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.181191 4713 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b5585b-519e-4399-b6ef-f4e8d0641705-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.590438 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-76b6b6b6bb-wvhf8_69b5585b-519e-4399-b6ef-f4e8d0641705/console/0.log" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.590484 4713 generic.go:334] "Generic (PLEG): container finished" podID="69b5585b-519e-4399-b6ef-f4e8d0641705" containerID="5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e" exitCode=2 Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.590517 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b6b6b6bb-wvhf8" event={"ID":"69b5585b-519e-4399-b6ef-f4e8d0641705","Type":"ContainerDied","Data":"5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e"} Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.590543 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b6b6b6bb-wvhf8" event={"ID":"69b5585b-519e-4399-b6ef-f4e8d0641705","Type":"ContainerDied","Data":"0189fe34c4e4e6063568966ccd4d19abbb552303bab35142fb0a687782b10e21"} Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.590541 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76b6b6b6bb-wvhf8" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.590650 4713 scope.go:117] "RemoveContainer" containerID="5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.609236 4713 scope.go:117] "RemoveContainer" containerID="5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e" Mar 14 05:38:27 crc kubenswrapper[4713]: E0314 05:38:27.609978 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e\": container with ID starting with 5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e not found: ID does not exist" containerID="5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.610046 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e"} err="failed to get container status \"5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e\": rpc error: code = NotFound desc = could not find container \"5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e\": container with ID starting with 5f872b67b04830cf756fb4403b8426fe807e8cdb0c8f752706a34bbe5ebf416e not found: ID does not exist" Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.610108 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b6b6b6bb-wvhf8"] Mar 14 05:38:27 crc kubenswrapper[4713]: I0314 05:38:27.620160 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76b6b6b6bb-wvhf8"] Mar 14 05:38:29 crc kubenswrapper[4713]: I0314 05:38:29.571391 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="69b5585b-519e-4399-b6ef-f4e8d0641705" path="/var/lib/kubelet/pods/69b5585b-519e-4399-b6ef-f4e8d0641705/volumes" Mar 14 05:38:40 crc kubenswrapper[4713]: I0314 05:38:40.731490 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:38:40 crc kubenswrapper[4713]: I0314 05:38:40.732065 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:38:40 crc kubenswrapper[4713]: I0314 05:38:40.732120 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:38:40 crc kubenswrapper[4713]: I0314 05:38:40.732771 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b547c1a6b15d35c71b1fe36925c299d8cf39995de4a55026e9398295e2918673"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 05:38:40 crc kubenswrapper[4713]: I0314 05:38:40.732834 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://b547c1a6b15d35c71b1fe36925c299d8cf39995de4a55026e9398295e2918673" gracePeriod=600 Mar 14 05:38:41 crc kubenswrapper[4713]: I0314 05:38:41.695341 4713 generic.go:334] "Generic 
(PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="b547c1a6b15d35c71b1fe36925c299d8cf39995de4a55026e9398295e2918673" exitCode=0 Mar 14 05:38:41 crc kubenswrapper[4713]: I0314 05:38:41.695406 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"b547c1a6b15d35c71b1fe36925c299d8cf39995de4a55026e9398295e2918673"} Mar 14 05:38:41 crc kubenswrapper[4713]: I0314 05:38:41.696139 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"57dee1484521e9bfb2409914893e35a03113f50890dbf98510a8c171581cf4ea"} Mar 14 05:38:41 crc kubenswrapper[4713]: I0314 05:38:41.696180 4713 scope.go:117] "RemoveContainer" containerID="3178c6299ee5084d508d01472b2baefd0a7f8c581742b5a075487a50da502998" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.087394 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557780-d2clc"] Mar 14 05:40:01 crc kubenswrapper[4713]: E0314 05:40:01.088740 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e734759-d71c-4eec-9f82-fe7c4be7c9a6" containerName="oc" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.088757 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e734759-d71c-4eec-9f82-fe7c4be7c9a6" containerName="oc" Mar 14 05:40:01 crc kubenswrapper[4713]: E0314 05:40:01.088788 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69b5585b-519e-4399-b6ef-f4e8d0641705" containerName="console" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.088796 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b5585b-519e-4399-b6ef-f4e8d0641705" containerName="console" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 
05:40:01.088922 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="69b5585b-519e-4399-b6ef-f4e8d0641705" containerName="console" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.088944 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e734759-d71c-4eec-9f82-fe7c4be7c9a6" containerName="oc" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.089479 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557780-d2clc" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.094859 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.095190 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.095451 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.103279 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557780-d2clc"] Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.226335 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhpsl\" (UniqueName: \"kubernetes.io/projected/5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356-kube-api-access-dhpsl\") pod \"auto-csr-approver-29557780-d2clc\" (UID: \"5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356\") " pod="openshift-infra/auto-csr-approver-29557780-d2clc" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.327952 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhpsl\" (UniqueName: \"kubernetes.io/projected/5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356-kube-api-access-dhpsl\") pod \"auto-csr-approver-29557780-d2clc\" (UID: 
\"5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356\") " pod="openshift-infra/auto-csr-approver-29557780-d2clc" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.347931 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhpsl\" (UniqueName: \"kubernetes.io/projected/5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356-kube-api-access-dhpsl\") pod \"auto-csr-approver-29557780-d2clc\" (UID: \"5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356\") " pod="openshift-infra/auto-csr-approver-29557780-d2clc" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.413637 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557780-d2clc" Mar 14 05:40:01 crc kubenswrapper[4713]: I0314 05:40:01.853735 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557780-d2clc"] Mar 14 05:40:02 crc kubenswrapper[4713]: I0314 05:40:02.059563 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557780-d2clc" event={"ID":"5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356","Type":"ContainerStarted","Data":"421bf879862a158bcf2fab62c07da7a9d1ec31f14413f7145db39bd5e3c9d394"} Mar 14 05:40:04 crc kubenswrapper[4713]: I0314 05:40:04.077180 4713 generic.go:334] "Generic (PLEG): container finished" podID="5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356" containerID="1c1baa012cdbfb727466711887cec7f8e032ffc4fadb8a42bf3af7953ad0ef34" exitCode=0 Mar 14 05:40:04 crc kubenswrapper[4713]: I0314 05:40:04.077387 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557780-d2clc" event={"ID":"5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356","Type":"ContainerDied","Data":"1c1baa012cdbfb727466711887cec7f8e032ffc4fadb8a42bf3af7953ad0ef34"} Mar 14 05:40:05 crc kubenswrapper[4713]: I0314 05:40:05.376516 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557780-d2clc" Mar 14 05:40:05 crc kubenswrapper[4713]: I0314 05:40:05.497948 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhpsl\" (UniqueName: \"kubernetes.io/projected/5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356-kube-api-access-dhpsl\") pod \"5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356\" (UID: \"5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356\") " Mar 14 05:40:05 crc kubenswrapper[4713]: I0314 05:40:05.505675 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356-kube-api-access-dhpsl" (OuterVolumeSpecName: "kube-api-access-dhpsl") pod "5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356" (UID: "5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356"). InnerVolumeSpecName "kube-api-access-dhpsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:40:05 crc kubenswrapper[4713]: I0314 05:40:05.600255 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhpsl\" (UniqueName: \"kubernetes.io/projected/5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356-kube-api-access-dhpsl\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:06 crc kubenswrapper[4713]: I0314 05:40:06.098256 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557780-d2clc" event={"ID":"5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356","Type":"ContainerDied","Data":"421bf879862a158bcf2fab62c07da7a9d1ec31f14413f7145db39bd5e3c9d394"} Mar 14 05:40:06 crc kubenswrapper[4713]: I0314 05:40:06.098379 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557780-d2clc" Mar 14 05:40:06 crc kubenswrapper[4713]: I0314 05:40:06.098391 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="421bf879862a158bcf2fab62c07da7a9d1ec31f14413f7145db39bd5e3c9d394" Mar 14 05:40:06 crc kubenswrapper[4713]: I0314 05:40:06.455323 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557774-hl9gk"] Mar 14 05:40:06 crc kubenswrapper[4713]: I0314 05:40:06.459623 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557774-hl9gk"] Mar 14 05:40:07 crc kubenswrapper[4713]: I0314 05:40:07.581564 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce66bc3f-1fe6-495b-8d7c-7caaea776d81" path="/var/lib/kubelet/pods/ce66bc3f-1fe6-495b-8d7c-7caaea776d81/volumes" Mar 14 05:40:10 crc kubenswrapper[4713]: I0314 05:40:10.359889 4713 scope.go:117] "RemoveContainer" containerID="27e961efd45e801c359322a0df11b6c7d2d21836a53017b4a09d8777d9b87c5b" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.662006 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp"] Mar 14 05:40:34 crc kubenswrapper[4713]: E0314 05:40:34.663427 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356" containerName="oc" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.663450 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356" containerName="oc" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.663590 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356" containerName="oc" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.664726 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.670048 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.692003 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp"] Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.853665 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.853753 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mghhv\" (UniqueName: \"kubernetes.io/projected/a917bf05-be84-43b9-ae72-d8b59822aeaf-kube-api-access-mghhv\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.853806 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: 
I0314 05:40:34.954803 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.954905 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.954979 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mghhv\" (UniqueName: \"kubernetes.io/projected/a917bf05-be84-43b9-ae72-d8b59822aeaf-kube-api-access-mghhv\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.955344 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.955453 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.980550 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghhv\" (UniqueName: \"kubernetes.io/projected/a917bf05-be84-43b9-ae72-d8b59822aeaf-kube-api-access-mghhv\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:34 crc kubenswrapper[4713]: I0314 05:40:34.994576 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:36 crc kubenswrapper[4713]: I0314 05:40:36.291707 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp"] Mar 14 05:40:36 crc kubenswrapper[4713]: I0314 05:40:36.775388 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" event={"ID":"a917bf05-be84-43b9-ae72-d8b59822aeaf","Type":"ContainerStarted","Data":"312bcce70c6f2e8577c39ef316c8521dd085ec8eca4005af01c9eea378e78e10"} Mar 14 05:40:36 crc kubenswrapper[4713]: I0314 05:40:36.775764 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" event={"ID":"a917bf05-be84-43b9-ae72-d8b59822aeaf","Type":"ContainerStarted","Data":"5e81584afcc8b997972f701ec5b3eec34e528fe870e30a016bd93cb45c09444b"} Mar 14 05:40:37 crc kubenswrapper[4713]: I0314 05:40:37.781471 4713 
generic.go:334] "Generic (PLEG): container finished" podID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerID="312bcce70c6f2e8577c39ef316c8521dd085ec8eca4005af01c9eea378e78e10" exitCode=0 Mar 14 05:40:37 crc kubenswrapper[4713]: I0314 05:40:37.781513 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" event={"ID":"a917bf05-be84-43b9-ae72-d8b59822aeaf","Type":"ContainerDied","Data":"312bcce70c6f2e8577c39ef316c8521dd085ec8eca4005af01c9eea378e78e10"} Mar 14 05:40:39 crc kubenswrapper[4713]: I0314 05:40:39.870335 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" event={"ID":"a917bf05-be84-43b9-ae72-d8b59822aeaf","Type":"ContainerStarted","Data":"8de57c917d71b2d672435fe244bd7c89855bbc6158c6d8b924caf0dae1fd09cb"} Mar 14 05:40:40 crc kubenswrapper[4713]: I0314 05:40:40.877785 4713 generic.go:334] "Generic (PLEG): container finished" podID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerID="8de57c917d71b2d672435fe244bd7c89855bbc6158c6d8b924caf0dae1fd09cb" exitCode=0 Mar 14 05:40:40 crc kubenswrapper[4713]: I0314 05:40:40.877844 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" event={"ID":"a917bf05-be84-43b9-ae72-d8b59822aeaf","Type":"ContainerDied","Data":"8de57c917d71b2d672435fe244bd7c89855bbc6158c6d8b924caf0dae1fd09cb"} Mar 14 05:40:41 crc kubenswrapper[4713]: I0314 05:40:41.887602 4713 generic.go:334] "Generic (PLEG): container finished" podID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerID="5bfd914092c1e4322681fd71e6ce07109413f7066e73a557d0b3f9215fe5039d" exitCode=0 Mar 14 05:40:41 crc kubenswrapper[4713]: I0314 05:40:41.887683 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" 
event={"ID":"a917bf05-be84-43b9-ae72-d8b59822aeaf","Type":"ContainerDied","Data":"5bfd914092c1e4322681fd71e6ce07109413f7066e73a557d0b3f9215fe5039d"} Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.153706 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.282900 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-util\") pod \"a917bf05-be84-43b9-ae72-d8b59822aeaf\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.283064 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-bundle\") pod \"a917bf05-be84-43b9-ae72-d8b59822aeaf\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.283105 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mghhv\" (UniqueName: \"kubernetes.io/projected/a917bf05-be84-43b9-ae72-d8b59822aeaf-kube-api-access-mghhv\") pod \"a917bf05-be84-43b9-ae72-d8b59822aeaf\" (UID: \"a917bf05-be84-43b9-ae72-d8b59822aeaf\") " Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.285018 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-bundle" (OuterVolumeSpecName: "bundle") pod "a917bf05-be84-43b9-ae72-d8b59822aeaf" (UID: "a917bf05-be84-43b9-ae72-d8b59822aeaf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.287728 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a917bf05-be84-43b9-ae72-d8b59822aeaf-kube-api-access-mghhv" (OuterVolumeSpecName: "kube-api-access-mghhv") pod "a917bf05-be84-43b9-ae72-d8b59822aeaf" (UID: "a917bf05-be84-43b9-ae72-d8b59822aeaf"). InnerVolumeSpecName "kube-api-access-mghhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.294439 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-util" (OuterVolumeSpecName: "util") pod "a917bf05-be84-43b9-ae72-d8b59822aeaf" (UID: "a917bf05-be84-43b9-ae72-d8b59822aeaf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.384398 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-util\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.384433 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a917bf05-be84-43b9-ae72-d8b59822aeaf-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.384442 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mghhv\" (UniqueName: \"kubernetes.io/projected/a917bf05-be84-43b9-ae72-d8b59822aeaf-kube-api-access-mghhv\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.905721 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" 
event={"ID":"a917bf05-be84-43b9-ae72-d8b59822aeaf","Type":"ContainerDied","Data":"5e81584afcc8b997972f701ec5b3eec34e528fe870e30a016bd93cb45c09444b"} Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.905809 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e81584afcc8b997972f701ec5b3eec34e528fe870e30a016bd93cb45c09444b" Mar 14 05:40:43 crc kubenswrapper[4713]: I0314 05:40:43.905929 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp" Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.756680 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4ds64"] Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.757388 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovn-controller" containerID="cri-o://03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522" gracePeriod=30 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.758233 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="sbdb" containerID="cri-o://4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1" gracePeriod=30 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.758286 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="nbdb" containerID="cri-o://3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639" gracePeriod=30 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.758321 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="northd" containerID="cri-o://80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110" gracePeriod=30 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.758358 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff" gracePeriod=30 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.758392 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kube-rbac-proxy-node" containerID="cri-o://1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0" gracePeriod=30 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.758431 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovn-acl-logging" containerID="cri-o://94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3" gracePeriod=30 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.801795 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" containerID="cri-o://6727dd3495d1353710973a463cb317f3608479ca0b89ddd20fc9a6bb11606848" gracePeriod=30 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.964709 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/3.log" Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 
05:40:45.969455 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovn-acl-logging/0.log" Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.969961 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovn-controller/0.log" Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.975481 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3" exitCode=143 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.975522 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522" exitCode=143 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.975592 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3"} Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.975626 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522"} Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.992135 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5l5jq_703b6542-1a83-442a-9673-6a774399dd7e/kube-multus/2.log" Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.998619 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5l5jq_703b6542-1a83-442a-9673-6a774399dd7e/kube-multus/1.log" Mar 14 
05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.998672 4713 generic.go:334] "Generic (PLEG): container finished" podID="703b6542-1a83-442a-9673-6a774399dd7e" containerID="585e486410f622ff2293f2ac4863100ffe11ca3f5279fb10a5283757080639be" exitCode=2 Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.998709 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5l5jq" event={"ID":"703b6542-1a83-442a-9673-6a774399dd7e","Type":"ContainerDied","Data":"585e486410f622ff2293f2ac4863100ffe11ca3f5279fb10a5283757080639be"} Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.998748 4713 scope.go:117] "RemoveContainer" containerID="430ac204c773fa77f410577e29400c5c9e6a5e4247e3c1eb5727da2386b81ad8" Mar 14 05:40:45 crc kubenswrapper[4713]: I0314 05:40:45.999448 4713 scope.go:117] "RemoveContainer" containerID="585e486410f622ff2293f2ac4863100ffe11ca3f5279fb10a5283757080639be" Mar 14 05:40:46 crc kubenswrapper[4713]: E0314 05:40:46.045441 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703b6542_1a83_442a_9673_6a774399dd7e.slice/crio-585e486410f622ff2293f2ac4863100ffe11ca3f5279fb10a5283757080639be.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6632626e_d806_4de3_b20a_6ee10099a464.slice/crio-4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703b6542_1a83_442a_9673_6a774399dd7e.slice/crio-conmon-585e486410f622ff2293f2ac4863100ffe11ca3f5279fb10a5283757080639be.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6632626e_d806_4de3_b20a_6ee10099a464.slice/crio-6727dd3495d1353710973a463cb317f3608479ca0b89ddd20fc9a6bb11606848.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6632626e_d806_4de3_b20a_6ee10099a464.slice/crio-conmon-4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.006530 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5l5jq_703b6542-1a83-442a-9673-6a774399dd7e/kube-multus/2.log" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.006881 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5l5jq" event={"ID":"703b6542-1a83-442a-9673-6a774399dd7e","Type":"ContainerStarted","Data":"8832bed43f583fa553ed5ba346e3c0e36ccc7e71d6e6ba42fd365069165879ff"} Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.009698 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovnkube-controller/3.log" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.011958 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovn-acl-logging/0.log" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.012457 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovn-controller/0.log" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.012825 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="6727dd3495d1353710973a463cb317f3608479ca0b89ddd20fc9a6bb11606848" exitCode=0 Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.012846 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" 
containerID="4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1" exitCode=0 Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.012877 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639" exitCode=0 Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.012885 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110" exitCode=0 Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.012894 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff" exitCode=0 Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.012901 4713 generic.go:334] "Generic (PLEG): container finished" podID="6632626e-d806-4de3-b20a-6ee10099a464" containerID="1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0" exitCode=0 Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.012923 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"6727dd3495d1353710973a463cb317f3608479ca0b89ddd20fc9a6bb11606848"} Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.013048 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1"} Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.013064 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" 
event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639"} Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.013080 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110"} Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.013092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff"} Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.013123 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0"} Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.013143 4713 scope.go:117] "RemoveContainer" containerID="36265730a75d3416c0b0a8d9eb40c1d86e5891f5653df6bede643330b2290339" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.498466 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovn-acl-logging/0.log" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.498973 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovn-controller/0.log" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.499479 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.635688 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tb6w5"] Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.635955 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.635973 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.635991 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="sbdb" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636000 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="sbdb" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636018 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kubecfg-setup" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636028 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kubecfg-setup" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636037 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636045 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636053 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerName="util" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636062 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerName="util" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636071 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerName="pull" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636080 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerName="pull" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636089 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovn-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636097 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovn-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636108 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="northd" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636116 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="northd" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636132 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="nbdb" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636140 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="nbdb" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636151 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc 
kubenswrapper[4713]: I0314 05:40:47.636158 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636165 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636171 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636180 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636186 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636197 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kube-rbac-proxy-node" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636221 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kube-rbac-proxy-node" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636232 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovn-acl-logging" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636238 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovn-acl-logging" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636251 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerName="extract" Mar 14 05:40:47 
crc kubenswrapper[4713]: I0314 05:40:47.636256 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerName="extract" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636367 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovn-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636380 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636388 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636394 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636404 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="northd" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636412 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636419 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636427 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="kube-rbac-proxy-node" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636434 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" 
containerName="sbdb" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636441 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovn-acl-logging" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636449 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="nbdb" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636455 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a917bf05-be84-43b9-ae72-d8b59822aeaf" containerName="extract" Mar 14 05:40:47 crc kubenswrapper[4713]: E0314 05:40:47.636563 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636570 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.636685 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6632626e-d806-4de3-b20a-6ee10099a464" containerName="ovnkube-controller" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.638607 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.648896 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-systemd-units\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.648931 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-node-log\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.648965 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-kubelet\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649015 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649013 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649047 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-node-log" (OuterVolumeSpecName: "node-log") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649027 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-env-overrides\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649093 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tztwk\" (UniqueName: \"kubernetes.io/projected/6632626e-d806-4de3-b20a-6ee10099a464-kube-api-access-tztwk\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649323 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-var-lib-openvswitch\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649333 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649352 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649878 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-config\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649910 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-systemd\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649944 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6632626e-d806-4de3-b20a-6ee10099a464-ovn-node-metrics-cert\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649961 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-slash\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649981 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-ovn-kubernetes\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.649994 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-bin\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650009 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-netd\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650044 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-openvswitch\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650062 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-script-lib\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650081 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650130 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-etc-openvswitch\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650144 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-netns\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650162 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-log-socket\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650080 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650182 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-ovn\") pod \"6632626e-d806-4de3-b20a-6ee10099a464\" (UID: \"6632626e-d806-4de3-b20a-6ee10099a464\") " Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650321 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-var-lib-openvswitch\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650352 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fbc093d9-92a6-4316-b7d0-761dafaec95b-env-overrides\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650382 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-run-ovn-kubernetes\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650401 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdsl6\" (UniqueName: \"kubernetes.io/projected/fbc093d9-92a6-4316-b7d0-761dafaec95b-kube-api-access-hdsl6\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650415 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-node-log\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650104 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-slash" (OuterVolumeSpecName: "host-slash") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650129 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650146 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650163 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650194 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650244 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650258 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650278 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-log-socket" (OuterVolumeSpecName: "log-socket") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650301 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650360 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.650514 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663288 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-run-ovn\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663664 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fbc093d9-92a6-4316-b7d0-761dafaec95b-ovnkube-script-lib\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663714 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fbc093d9-92a6-4316-b7d0-761dafaec95b-ovn-node-metrics-cert\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663743 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-slash\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663809 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-etc-openvswitch\") pod \"ovnkube-node-tb6w5\" (UID: 
\"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663836 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663888 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-cni-bin\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663926 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-run-openvswitch\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fbc093d9-92a6-4316-b7d0-761dafaec95b-ovnkube-config\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.663968 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-log-socket\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.664012 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-systemd-units\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.664046 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-run-netns\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.664089 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-run-systemd\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.664088 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6632626e-d806-4de3-b20a-6ee10099a464-kube-api-access-tztwk" (OuterVolumeSpecName: "kube-api-access-tztwk") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "kube-api-access-tztwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.664114 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-cni-netd\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.664845 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6632626e-d806-4de3-b20a-6ee10099a464-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.664943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-kubelet\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.665315 4713 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.665430 4713 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.665512 4713 reconciler_common.go:293] "Volume detached for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-log-socket\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.665588 4713 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.665663 4713 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-node-log\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.665735 4713 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.665849 4713 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.665931 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tztwk\" (UniqueName: \"kubernetes.io/projected/6632626e-d806-4de3-b20a-6ee10099a464-kube-api-access-tztwk\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666005 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666076 4713 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 
05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666148 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666241 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6632626e-d806-4de3-b20a-6ee10099a464-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666317 4713 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-slash\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666399 4713 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666472 4713 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666555 4713 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666638 4713 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666716 4713 reconciler_common.go:293] "Volume detached 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6632626e-d806-4de3-b20a-6ee10099a464-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.666790 4713 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.682656 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6632626e-d806-4de3-b20a-6ee10099a464" (UID: "6632626e-d806-4de3-b20a-6ee10099a464"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768147 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-etc-openvswitch\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768223 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768262 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-cni-bin\") pod 
\"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768286 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-run-openvswitch\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768309 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fbc093d9-92a6-4316-b7d0-761dafaec95b-ovnkube-config\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768331 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-log-socket\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768365 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-systemd-units\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-run-netns\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768419 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-run-systemd\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768441 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-cni-netd\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768464 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-kubelet\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768489 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-var-lib-openvswitch\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768524 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fbc093d9-92a6-4316-b7d0-761dafaec95b-env-overrides\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-run-ovn-kubernetes\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768578 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdsl6\" (UniqueName: \"kubernetes.io/projected/fbc093d9-92a6-4316-b7d0-761dafaec95b-kube-api-access-hdsl6\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768601 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-node-log\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768630 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-run-ovn\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768659 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fbc093d9-92a6-4316-b7d0-761dafaec95b-ovnkube-script-lib\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fbc093d9-92a6-4316-b7d0-761dafaec95b-ovn-node-metrics-cert\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768709 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-slash\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768773 4713 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6632626e-d806-4de3-b20a-6ee10099a464-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768823 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-slash\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-etc-openvswitch\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768901 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768931 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-cni-bin\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.768958 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-run-openvswitch\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.769678 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fbc093d9-92a6-4316-b7d0-761dafaec95b-ovnkube-config\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.769770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-log-socket\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.769811 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-systemd-units\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.769843 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-run-netns\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.769871 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-run-systemd\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.769900 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-cni-netd\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.769928 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-kubelet\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.769959 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-var-lib-openvswitch\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.770377 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fbc093d9-92a6-4316-b7d0-761dafaec95b-env-overrides\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.770429 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-host-run-ovn-kubernetes\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.770722 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-node-log\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.770767 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fbc093d9-92a6-4316-b7d0-761dafaec95b-run-ovn\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.771334 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fbc093d9-92a6-4316-b7d0-761dafaec95b-ovnkube-script-lib\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.779543 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fbc093d9-92a6-4316-b7d0-761dafaec95b-ovn-node-metrics-cert\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.790919 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdsl6\" (UniqueName: \"kubernetes.io/projected/fbc093d9-92a6-4316-b7d0-761dafaec95b-kube-api-access-hdsl6\") pod \"ovnkube-node-tb6w5\" (UID: \"fbc093d9-92a6-4316-b7d0-761dafaec95b\") " pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: I0314 05:40:47.964867 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5"
Mar 14 05:40:47 crc kubenswrapper[4713]: W0314 05:40:47.991768 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbc093d9_92a6_4316_b7d0_761dafaec95b.slice/crio-933f97e28aaff0e62e90d2dc84b190b45072b0260a3f63986290338052b28bc4 WatchSource:0}: Error finding container 933f97e28aaff0e62e90d2dc84b190b45072b0260a3f63986290338052b28bc4: Status 404 returned error can't find the container with id 933f97e28aaff0e62e90d2dc84b190b45072b0260a3f63986290338052b28bc4
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.022130 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovn-acl-logging/0.log"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.022555 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4ds64_6632626e-d806-4de3-b20a-6ee10099a464/ovn-controller/0.log"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.022959 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64" event={"ID":"6632626e-d806-4de3-b20a-6ee10099a464","Type":"ContainerDied","Data":"f92c9f70bf67c34a4db97cec7349813e72f81c1680938ea3ea88419f027eb41b"}
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.023017 4713 scope.go:117] "RemoveContainer" containerID="6727dd3495d1353710973a463cb317f3608479ca0b89ddd20fc9a6bb11606848"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.023026 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4ds64"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.024385 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerStarted","Data":"933f97e28aaff0e62e90d2dc84b190b45072b0260a3f63986290338052b28bc4"}
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.060400 4713 scope.go:117] "RemoveContainer" containerID="4564bcce7d64c27524871cacb4e7c6ab82acd1dc844d4f85fa6a055087df62c1"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.077994 4713 scope.go:117] "RemoveContainer" containerID="3b23c8b8f262a9e0998caf08838d07c68ae63393b5d634a0526b301166647639"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.116908 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4ds64"]
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.122389 4713 scope.go:117] "RemoveContainer" containerID="80da518f4a3fdd4cb3498cf5621d6a92d48c4cd436ec56be0a568b3b75004110"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.132191 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4ds64"]
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.145370 4713 scope.go:117] "RemoveContainer" containerID="8abf8d26ea5efc31b30bd5a5e96feb1374c009b942e8eb34e97d599526f956ff"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.173484 4713 scope.go:117] "RemoveContainer" containerID="1e938255444d86edc27bfa8da213de947d10f58bfb24811fb29e4130ea6624a0"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.230957 4713 scope.go:117] "RemoveContainer" containerID="94912f697aac8bb6bdefae21a8961d1b83f503cb0c722c3732d0b45437ece7a3"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.249307 4713 scope.go:117] "RemoveContainer" containerID="03c9283a7d68d2246cf489ff05e385729ee03a9802e6cf463eaded6c2ac84522"
Mar 14 05:40:48 crc kubenswrapper[4713]: I0314 05:40:48.265887 4713 scope.go:117] "RemoveContainer" containerID="245ad19a9f1cbea72c86c724506544c9df3edbe510d085bd28af378ca0c1a53c"
Mar 14 05:40:49 crc kubenswrapper[4713]: I0314 05:40:49.034034 4713 generic.go:334] "Generic (PLEG): container finished" podID="fbc093d9-92a6-4316-b7d0-761dafaec95b" containerID="e57e2066f0d17dbe00376638ea9010540191e4cc8488fa5269334d14de556da5" exitCode=0
Mar 14 05:40:49 crc kubenswrapper[4713]: I0314 05:40:49.034094 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerDied","Data":"e57e2066f0d17dbe00376638ea9010540191e4cc8488fa5269334d14de556da5"}
Mar 14 05:40:49 crc kubenswrapper[4713]: I0314 05:40:49.570796 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6632626e-d806-4de3-b20a-6ee10099a464" path="/var/lib/kubelet/pods/6632626e-d806-4de3-b20a-6ee10099a464/volumes"
Mar 14 05:40:50 crc kubenswrapper[4713]: I0314 05:40:50.042280 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerStarted","Data":"52671f2c34d62917bf14601166cbec2f5ccbf2824db6d26c30bbdf50d30f8d4b"}
Mar 14 05:40:50 crc kubenswrapper[4713]: I0314 05:40:50.042321 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerStarted","Data":"05e9731bb458fa51093bbd7ef775ab62289bf9342939a5f944680582d858c53a"}
Mar 14 05:40:50 crc kubenswrapper[4713]: I0314 05:40:50.042331 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerStarted","Data":"d8845083ff42cb6904143d76faa647dd46309650933ec0938b0f384fed50d994"}
Mar 14 05:40:50 crc kubenswrapper[4713]: I0314 05:40:50.042340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerStarted","Data":"06a2b51327a3638737b127a39d8ef46458dc3aac90524a69d26a7b1a3dbd80c0"}
Mar 14 05:40:51 crc kubenswrapper[4713]: I0314 05:40:51.052799 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerStarted","Data":"eceaab35283d9d8ba9116ce5ea393778122d3587a0a045c8c738ac4f1ee04089"}
Mar 14 05:40:51 crc kubenswrapper[4713]: I0314 05:40:51.053106 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerStarted","Data":"0c4e6905c43008f003ad12159531c4c804319649d4c7a9f1bb03983f6d9b9291"}
Mar 14 05:40:51 crc kubenswrapper[4713]: I0314 05:40:51.907564 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"]
Mar 14 05:40:51 crc kubenswrapper[4713]: I0314 05:40:51.908484 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"
Mar 14 05:40:51 crc kubenswrapper[4713]: I0314 05:40:51.910193 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-x5xn6"
Mar 14 05:40:51 crc kubenswrapper[4713]: I0314 05:40:51.910309 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 14 05:40:51 crc kubenswrapper[4713]: I0314 05:40:51.912604 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 14 05:40:51 crc kubenswrapper[4713]: I0314 05:40:51.919523 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2xrk\" (UniqueName: \"kubernetes.io/projected/428d4860-9850-46cc-82c8-5fcf46c06748-kube-api-access-s2xrk\") pod \"obo-prometheus-operator-68bc856cb9-kdwsl\" (UID: \"428d4860-9850-46cc-82c8-5fcf46c06748\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.020423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2xrk\" (UniqueName: \"kubernetes.io/projected/428d4860-9850-46cc-82c8-5fcf46c06748-kube-api-access-s2xrk\") pod \"obo-prometheus-operator-68bc856cb9-kdwsl\" (UID: \"428d4860-9850-46cc-82c8-5fcf46c06748\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.049007 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2xrk\" (UniqueName: \"kubernetes.io/projected/428d4860-9850-46cc-82c8-5fcf46c06748-kube-api-access-s2xrk\") pod \"obo-prometheus-operator-68bc856cb9-kdwsl\" (UID: \"428d4860-9850-46cc-82c8-5fcf46c06748\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.107856 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"]
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.108751 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.110475 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.111050 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-5cz8s"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.115605 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"]
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.116448 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.121697 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f90d6ef-cc15-4d38-a2c5-f5e778500c73-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d\" (UID: \"8f90d6ef-cc15-4d38-a2c5-f5e778500c73\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.121954 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f90d6ef-cc15-4d38-a2c5-f5e778500c73-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d\" (UID: \"8f90d6ef-cc15-4d38-a2c5-f5e778500c73\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.222781 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f90d6ef-cc15-4d38-a2c5-f5e778500c73-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d\" (UID: \"8f90d6ef-cc15-4d38-a2c5-f5e778500c73\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.222837 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f90d6ef-cc15-4d38-a2c5-f5e778500c73-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d\" (UID: \"8f90d6ef-cc15-4d38-a2c5-f5e778500c73\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.222869 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9bce1139-ffd0-4518-8f2c-7ef46b2892e4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl\" (UID: \"9bce1139-ffd0-4518-8f2c-7ef46b2892e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.222949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9bce1139-ffd0-4518-8f2c-7ef46b2892e4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl\" (UID: \"9bce1139-ffd0-4518-8f2c-7ef46b2892e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.223766 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.228766 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f90d6ef-cc15-4d38-a2c5-f5e778500c73-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d\" (UID: \"8f90d6ef-cc15-4d38-a2c5-f5e778500c73\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.229094 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f90d6ef-cc15-4d38-a2c5-f5e778500c73-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d\" (UID: \"8f90d6ef-cc15-4d38-a2c5-f5e778500c73\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.236781 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-wnccn"]
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.237514 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-wnccn"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.239301 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-9jlqz"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.239694 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.271490 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators_428d4860-9850-46cc-82c8-5fcf46c06748_0(85153e6f51368d91baa233ff61a9d5f4fcbb3771b495dd57582b6d99fa8ad889): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.271552 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators_428d4860-9850-46cc-82c8-5fcf46c06748_0(85153e6f51368d91baa233ff61a9d5f4fcbb3771b495dd57582b6d99fa8ad889): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.271621 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators_428d4860-9850-46cc-82c8-5fcf46c06748_0(85153e6f51368d91baa233ff61a9d5f4fcbb3771b495dd57582b6d99fa8ad889): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.271660 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators(428d4860-9850-46cc-82c8-5fcf46c06748)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators(428d4860-9850-46cc-82c8-5fcf46c06748)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators_428d4860-9850-46cc-82c8-5fcf46c06748_0(85153e6f51368d91baa233ff61a9d5f4fcbb3771b495dd57582b6d99fa8ad889): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" podUID="428d4860-9850-46cc-82c8-5fcf46c06748"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.324088 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9bce1139-ffd0-4518-8f2c-7ef46b2892e4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl\" (UID: \"9bce1139-ffd0-4518-8f2c-7ef46b2892e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.324161 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9bce1139-ffd0-4518-8f2c-7ef46b2892e4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl\" (UID: \"9bce1139-ffd0-4518-8f2c-7ef46b2892e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.327467 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9bce1139-ffd0-4518-8f2c-7ef46b2892e4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl\" (UID: \"9bce1139-ffd0-4518-8f2c-7ef46b2892e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.327810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9bce1139-ffd0-4518-8f2c-7ef46b2892e4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl\" (UID: \"9bce1139-ffd0-4518-8f2c-7ef46b2892e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.425934 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hms\" (UniqueName: \"kubernetes.io/projected/94841b22-b2eb-4519-b04f-98010d848b46-kube-api-access-t7hms\") pod \"observability-operator-59bdc8b94-wnccn\" (UID: \"94841b22-b2eb-4519-b04f-98010d848b46\") " pod="openshift-operators/observability-operator-59bdc8b94-wnccn"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.426061 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/94841b22-b2eb-4519-b04f-98010d848b46-observability-operator-tls\") pod \"observability-operator-59bdc8b94-wnccn\" (UID: \"94841b22-b2eb-4519-b04f-98010d848b46\") " pod="openshift-operators/observability-operator-59bdc8b94-wnccn"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.432589 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.441305 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.462316 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators_8f90d6ef-cc15-4d38-a2c5-f5e778500c73_0(3ffe0498996326d52a662a419136f725cdc2ebbbde8b1a5ab6075d9d6e73afda): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.462390 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators_8f90d6ef-cc15-4d38-a2c5-f5e778500c73_0(3ffe0498996326d52a662a419136f725cdc2ebbbde8b1a5ab6075d9d6e73afda): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.462420 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators_8f90d6ef-cc15-4d38-a2c5-f5e778500c73_0(3ffe0498996326d52a662a419136f725cdc2ebbbde8b1a5ab6075d9d6e73afda): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.462473 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators(8f90d6ef-cc15-4d38-a2c5-f5e778500c73)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators(8f90d6ef-cc15-4d38-a2c5-f5e778500c73)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators_8f90d6ef-cc15-4d38-a2c5-f5e778500c73_0(3ffe0498996326d52a662a419136f725cdc2ebbbde8b1a5ab6075d9d6e73afda): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" podUID="8f90d6ef-cc15-4d38-a2c5-f5e778500c73"
Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.482183 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-hqw7p"]
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.482844 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators_9bce1139-ffd0-4518-8f2c-7ef46b2892e4_0(de1e10b6d15629afea053c53029a67966ce9b19cfd91b0fadabb020907e98baf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.482918 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators_9bce1139-ffd0-4518-8f2c-7ef46b2892e4_0(de1e10b6d15629afea053c53029a67966ce9b19cfd91b0fadabb020907e98baf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.482939 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators_9bce1139-ffd0-4518-8f2c-7ef46b2892e4_0(de1e10b6d15629afea053c53029a67966ce9b19cfd91b0fadabb020907e98baf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"
Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.482982 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators(9bce1139-ffd0-4518-8f2c-7ef46b2892e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators(9bce1139-ffd0-4518-8f2c-7ef46b2892e4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators_9bce1139-ffd0-4518-8f2c-7ef46b2892e4_0(de1e10b6d15629afea053c53029a67966ce9b19cfd91b0fadabb020907e98baf): no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" podUID="9bce1139-ffd0-4518-8f2c-7ef46b2892e4" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.483113 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.485693 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zrq46" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.527110 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/94841b22-b2eb-4519-b04f-98010d848b46-observability-operator-tls\") pod \"observability-operator-59bdc8b94-wnccn\" (UID: \"94841b22-b2eb-4519-b04f-98010d848b46\") " pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.527218 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hms\" (UniqueName: \"kubernetes.io/projected/94841b22-b2eb-4519-b04f-98010d848b46-kube-api-access-t7hms\") pod \"observability-operator-59bdc8b94-wnccn\" (UID: \"94841b22-b2eb-4519-b04f-98010d848b46\") " pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.530286 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/94841b22-b2eb-4519-b04f-98010d848b46-observability-operator-tls\") pod \"observability-operator-59bdc8b94-wnccn\" (UID: \"94841b22-b2eb-4519-b04f-98010d848b46\") " pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.541464 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t7hms\" (UniqueName: \"kubernetes.io/projected/94841b22-b2eb-4519-b04f-98010d848b46-kube-api-access-t7hms\") pod \"observability-operator-59bdc8b94-wnccn\" (UID: \"94841b22-b2eb-4519-b04f-98010d848b46\") " pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.605418 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.626493 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-wnccn_openshift-operators_94841b22-b2eb-4519-b04f-98010d848b46_0(988137c3e56c6c1b6f3f75ff133e34be99e6b58015b0376ecd49ee6d4ad29082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.626553 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-wnccn_openshift-operators_94841b22-b2eb-4519-b04f-98010d848b46_0(988137c3e56c6c1b6f3f75ff133e34be99e6b58015b0376ecd49ee6d4ad29082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.626576 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-wnccn_openshift-operators_94841b22-b2eb-4519-b04f-98010d848b46_0(988137c3e56c6c1b6f3f75ff133e34be99e6b58015b0376ecd49ee6d4ad29082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.626621 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-wnccn_openshift-operators(94841b22-b2eb-4519-b04f-98010d848b46)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-wnccn_openshift-operators(94841b22-b2eb-4519-b04f-98010d848b46)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-wnccn_openshift-operators_94841b22-b2eb-4519-b04f-98010d848b46_0(988137c3e56c6c1b6f3f75ff133e34be99e6b58015b0376ecd49ee6d4ad29082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" podUID="94841b22-b2eb-4519-b04f-98010d848b46" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.628869 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqtq8\" (UniqueName: \"kubernetes.io/projected/f9178880-ef43-43c5-8e91-f4c46d4aa0c6-kube-api-access-fqtq8\") pod \"perses-operator-5bf474d74f-hqw7p\" (UID: \"f9178880-ef43-43c5-8e91-f4c46d4aa0c6\") " pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.628930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9178880-ef43-43c5-8e91-f4c46d4aa0c6-openshift-service-ca\") pod \"perses-operator-5bf474d74f-hqw7p\" (UID: \"f9178880-ef43-43c5-8e91-f4c46d4aa0c6\") " pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.730123 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqtq8\" (UniqueName: 
\"kubernetes.io/projected/f9178880-ef43-43c5-8e91-f4c46d4aa0c6-kube-api-access-fqtq8\") pod \"perses-operator-5bf474d74f-hqw7p\" (UID: \"f9178880-ef43-43c5-8e91-f4c46d4aa0c6\") " pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.730237 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9178880-ef43-43c5-8e91-f4c46d4aa0c6-openshift-service-ca\") pod \"perses-operator-5bf474d74f-hqw7p\" (UID: \"f9178880-ef43-43c5-8e91-f4c46d4aa0c6\") " pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.731298 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9178880-ef43-43c5-8e91-f4c46d4aa0c6-openshift-service-ca\") pod \"perses-operator-5bf474d74f-hqw7p\" (UID: \"f9178880-ef43-43c5-8e91-f4c46d4aa0c6\") " pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.755117 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqtq8\" (UniqueName: \"kubernetes.io/projected/f9178880-ef43-43c5-8e91-f4c46d4aa0c6-kube-api-access-fqtq8\") pod \"perses-operator-5bf474d74f-hqw7p\" (UID: \"f9178880-ef43-43c5-8e91-f4c46d4aa0c6\") " pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: I0314 05:40:52.801810 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.823558 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-hqw7p_openshift-operators_f9178880-ef43-43c5-8e91-f4c46d4aa0c6_0(8b09f97a822b84908585968e11951758586356fed081ac7b7da317406a1e3f7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.823620 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-hqw7p_openshift-operators_f9178880-ef43-43c5-8e91-f4c46d4aa0c6_0(8b09f97a822b84908585968e11951758586356fed081ac7b7da317406a1e3f7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.823642 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-hqw7p_openshift-operators_f9178880-ef43-43c5-8e91-f4c46d4aa0c6_0(8b09f97a822b84908585968e11951758586356fed081ac7b7da317406a1e3f7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:52 crc kubenswrapper[4713]: E0314 05:40:52.823688 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-hqw7p_openshift-operators(f9178880-ef43-43c5-8e91-f4c46d4aa0c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-hqw7p_openshift-operators(f9178880-ef43-43c5-8e91-f4c46d4aa0c6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-hqw7p_openshift-operators_f9178880-ef43-43c5-8e91-f4c46d4aa0c6_0(8b09f97a822b84908585968e11951758586356fed081ac7b7da317406a1e3f7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podUID="f9178880-ef43-43c5-8e91-f4c46d4aa0c6" Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.087955 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerStarted","Data":"caab5b0a08fdd1fff31644b7e0c4faf279a9846f5f10d59ce455288283ff238e"} Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.977954 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-wnccn"] Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.978107 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.978595 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.981731 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"] Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.981871 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.982416 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.986268 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"] Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.986413 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.986924 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.990013 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-hqw7p"] Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.990140 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:54 crc kubenswrapper[4713]: I0314 05:40:54.990588 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.041282 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"] Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.041660 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.042133 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.092400 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators_428d4860-9850-46cc-82c8-5fcf46c06748_0(ca8cccaec0e0583d00dba16d24e7c3c08f730855326492c1639ef4833db1ae69): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.092468 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators_428d4860-9850-46cc-82c8-5fcf46c06748_0(ca8cccaec0e0583d00dba16d24e7c3c08f730855326492c1639ef4833db1ae69): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.092494 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators_428d4860-9850-46cc-82c8-5fcf46c06748_0(ca8cccaec0e0583d00dba16d24e7c3c08f730855326492c1639ef4833db1ae69): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.092542 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators(428d4860-9850-46cc-82c8-5fcf46c06748)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators(428d4860-9850-46cc-82c8-5fcf46c06748)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-kdwsl_openshift-operators_428d4860-9850-46cc-82c8-5fcf46c06748_0(ca8cccaec0e0583d00dba16d24e7c3c08f730855326492c1639ef4833db1ae69): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" podUID="428d4860-9850-46cc-82c8-5fcf46c06748" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.121803 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators_9bce1139-ffd0-4518-8f2c-7ef46b2892e4_0(d464666b4acb197eef0c5c107c3ad7c37a5258c543ccd0ad988c860e57d32990): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.121882 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators_9bce1139-ffd0-4518-8f2c-7ef46b2892e4_0(d464666b4acb197eef0c5c107c3ad7c37a5258c543ccd0ad988c860e57d32990): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.121908 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators_9bce1139-ffd0-4518-8f2c-7ef46b2892e4_0(d464666b4acb197eef0c5c107c3ad7c37a5258c543ccd0ad988c860e57d32990): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.121956 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators(9bce1139-ffd0-4518-8f2c-7ef46b2892e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators(9bce1139-ffd0-4518-8f2c-7ef46b2892e4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_openshift-operators_9bce1139-ffd0-4518-8f2c-7ef46b2892e4_0(d464666b4acb197eef0c5c107c3ad7c37a5258c543ccd0ad988c860e57d32990): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" podUID="9bce1139-ffd0-4518-8f2c-7ef46b2892e4" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.140346 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-wnccn_openshift-operators_94841b22-b2eb-4519-b04f-98010d848b46_0(a4d9952450a6e96714f90c854c757d4b8d7f07cfceac63404a35894627a3ae5d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.140407 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-wnccn_openshift-operators_94841b22-b2eb-4519-b04f-98010d848b46_0(a4d9952450a6e96714f90c854c757d4b8d7f07cfceac63404a35894627a3ae5d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.140429 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-wnccn_openshift-operators_94841b22-b2eb-4519-b04f-98010d848b46_0(a4d9952450a6e96714f90c854c757d4b8d7f07cfceac63404a35894627a3ae5d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.140478 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-wnccn_openshift-operators(94841b22-b2eb-4519-b04f-98010d848b46)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-wnccn_openshift-operators(94841b22-b2eb-4519-b04f-98010d848b46)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-wnccn_openshift-operators_94841b22-b2eb-4519-b04f-98010d848b46_0(a4d9952450a6e96714f90c854c757d4b8d7f07cfceac63404a35894627a3ae5d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" podUID="94841b22-b2eb-4519-b04f-98010d848b46" Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.158323 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" event={"ID":"fbc093d9-92a6-4316-b7d0-761dafaec95b","Type":"ContainerStarted","Data":"7b67a2316e55ddaa89f384cb4071226a5acc4618edac93ea9956bcad066add98"} Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.159409 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.159436 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.159479 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.175732 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network 
sandbox k8s_perses-operator-5bf474d74f-hqw7p_openshift-operators_f9178880-ef43-43c5-8e91-f4c46d4aa0c6_0(36dfcfe9957fd558ad52b73e58a9dbaae3cc133107a0fee5d9974d8f24c9375f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.175789 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-hqw7p_openshift-operators_f9178880-ef43-43c5-8e91-f4c46d4aa0c6_0(36dfcfe9957fd558ad52b73e58a9dbaae3cc133107a0fee5d9974d8f24c9375f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.175997 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-hqw7p_openshift-operators_f9178880-ef43-43c5-8e91-f4c46d4aa0c6_0(36dfcfe9957fd558ad52b73e58a9dbaae3cc133107a0fee5d9974d8f24c9375f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.176042 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-hqw7p_openshift-operators(f9178880-ef43-43c5-8e91-f4c46d4aa0c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-hqw7p_openshift-operators(f9178880-ef43-43c5-8e91-f4c46d4aa0c6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-hqw7p_openshift-operators_f9178880-ef43-43c5-8e91-f4c46d4aa0c6_0(36dfcfe9957fd558ad52b73e58a9dbaae3cc133107a0fee5d9974d8f24c9375f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podUID="f9178880-ef43-43c5-8e91-f4c46d4aa0c6" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.183384 4713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators_8f90d6ef-cc15-4d38-a2c5-f5e778500c73_0(0b4864a2a0ff631710058085d42f15ebe90ab38daa4419e0a4eb2dd6e8afde4b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.183449 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators_8f90d6ef-cc15-4d38-a2c5-f5e778500c73_0(0b4864a2a0ff631710058085d42f15ebe90ab38daa4419e0a4eb2dd6e8afde4b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.183472 4713 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators_8f90d6ef-cc15-4d38-a2c5-f5e778500c73_0(0b4864a2a0ff631710058085d42f15ebe90ab38daa4419e0a4eb2dd6e8afde4b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" Mar 14 05:40:55 crc kubenswrapper[4713]: E0314 05:40:55.183515 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators(8f90d6ef-cc15-4d38-a2c5-f5e778500c73)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators(8f90d6ef-cc15-4d38-a2c5-f5e778500c73)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_openshift-operators_8f90d6ef-cc15-4d38-a2c5-f5e778500c73_0(0b4864a2a0ff631710058085d42f15ebe90ab38daa4419e0a4eb2dd6e8afde4b): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" podUID="8f90d6ef-cc15-4d38-a2c5-f5e778500c73" Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.215092 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.219804 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" podStartSLOduration=8.219781005 podStartE2EDuration="8.219781005s" podCreationTimestamp="2026-03-14 05:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:40:55.216105978 +0000 UTC m=+838.304015278" watchObservedRunningTime="2026-03-14 05:40:55.219781005 +0000 UTC m=+838.307690305" Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.227911 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:40:55 crc kubenswrapper[4713]: I0314 05:40:55.334133 4713 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 05:41:07 crc kubenswrapper[4713]: I0314 05:41:07.562717 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:41:07 crc kubenswrapper[4713]: I0314 05:41:07.567622 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:41:07 crc kubenswrapper[4713]: I0314 05:41:07.816429 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-hqw7p"] Mar 14 05:41:07 crc kubenswrapper[4713]: W0314 05:41:07.822556 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9178880_ef43_43c5_8e91_f4c46d4aa0c6.slice/crio-fd8944fa34dba5e86e9ad4d0abce8a2b01fd1ced349d3c2afaeb920696918734 WatchSource:0}: Error finding container fd8944fa34dba5e86e9ad4d0abce8a2b01fd1ced349d3c2afaeb920696918734: Status 404 returned error can't find the container with id fd8944fa34dba5e86e9ad4d0abce8a2b01fd1ced349d3c2afaeb920696918734 Mar 14 05:41:07 crc kubenswrapper[4713]: I0314 05:41:07.826892 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 05:41:08 crc kubenswrapper[4713]: I0314 05:41:08.232177 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" event={"ID":"f9178880-ef43-43c5-8e91-f4c46d4aa0c6","Type":"ContainerStarted","Data":"fd8944fa34dba5e86e9ad4d0abce8a2b01fd1ced349d3c2afaeb920696918734"} Mar 14 05:41:08 crc kubenswrapper[4713]: I0314 05:41:08.563103 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" Mar 14 05:41:08 crc kubenswrapper[4713]: I0314 05:41:08.563627 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" Mar 14 05:41:08 crc kubenswrapper[4713]: I0314 05:41:08.864501 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl"] Mar 14 05:41:08 crc kubenswrapper[4713]: W0314 05:41:08.875761 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428d4860_9850_46cc_82c8_5fcf46c06748.slice/crio-4e79b7c53370743b8f9c13860a7d96871864a09e6b9eea5924cd8ebaa069fdf7 WatchSource:0}: Error finding container 4e79b7c53370743b8f9c13860a7d96871864a09e6b9eea5924cd8ebaa069fdf7: Status 404 returned error can't find the container with id 4e79b7c53370743b8f9c13860a7d96871864a09e6b9eea5924cd8ebaa069fdf7 Mar 14 05:41:09 crc kubenswrapper[4713]: I0314 05:41:09.241015 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" event={"ID":"428d4860-9850-46cc-82c8-5fcf46c06748","Type":"ContainerStarted","Data":"4e79b7c53370743b8f9c13860a7d96871864a09e6b9eea5924cd8ebaa069fdf7"} Mar 14 05:41:10 crc kubenswrapper[4713]: I0314 05:41:10.563308 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" Mar 14 05:41:10 crc kubenswrapper[4713]: I0314 05:41:10.563333 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" Mar 14 05:41:10 crc kubenswrapper[4713]: I0314 05:41:10.563339 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:41:10 crc kubenswrapper[4713]: I0314 05:41:10.564942 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" Mar 14 05:41:10 crc kubenswrapper[4713]: I0314 05:41:10.564973 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:41:10 crc kubenswrapper[4713]: I0314 05:41:10.565270 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" Mar 14 05:41:10 crc kubenswrapper[4713]: I0314 05:41:10.732196 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:41:10 crc kubenswrapper[4713]: I0314 05:41:10.732657 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:41:11 crc kubenswrapper[4713]: I0314 05:41:11.808647 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl"] Mar 14 05:41:11 crc kubenswrapper[4713]: I0314 05:41:11.817792 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d"] Mar 14 05:41:11 crc kubenswrapper[4713]: I0314 05:41:11.832288 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-wnccn"] Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.782478 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-rlcn9"] Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.784136 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.793442 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rlcn9"] Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.859183 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4shk\" (UniqueName: \"kubernetes.io/projected/2b67dacc-e93d-435c-b930-1731cef0fdaf-kube-api-access-v4shk\") pod \"redhat-operators-rlcn9\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.859298 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-catalog-content\") pod \"redhat-operators-rlcn9\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.859374 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-utilities\") pod \"redhat-operators-rlcn9\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.960552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4shk\" (UniqueName: \"kubernetes.io/projected/2b67dacc-e93d-435c-b930-1731cef0fdaf-kube-api-access-v4shk\") pod \"redhat-operators-rlcn9\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " 
pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.960620 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-catalog-content\") pod \"redhat-operators-rlcn9\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.960659 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-utilities\") pod \"redhat-operators-rlcn9\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.961152 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-utilities\") pod \"redhat-operators-rlcn9\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:12 crc kubenswrapper[4713]: I0314 05:41:12.962028 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-catalog-content\") pod \"redhat-operators-rlcn9\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:13 crc kubenswrapper[4713]: I0314 05:41:13.010100 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4shk\" (UniqueName: \"kubernetes.io/projected/2b67dacc-e93d-435c-b930-1731cef0fdaf-kube-api-access-v4shk\") pod \"redhat-operators-rlcn9\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:13 crc 
kubenswrapper[4713]: I0314 05:41:13.175698 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:15 crc kubenswrapper[4713]: I0314 05:41:15.787693 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" event={"ID":"8f90d6ef-cc15-4d38-a2c5-f5e778500c73","Type":"ContainerStarted","Data":"8476d84589ddb94ee885d245acf8f57b2084cb8ba371318c7d09fa3303020010"} Mar 14 05:41:15 crc kubenswrapper[4713]: W0314 05:41:15.839749 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bce1139_ffd0_4518_8f2c_7ef46b2892e4.slice/crio-2e38b60224d62133518083af11f1a7b0ac3c121adcc64df388b08449ba0e9255 WatchSource:0}: Error finding container 2e38b60224d62133518083af11f1a7b0ac3c121adcc64df388b08449ba0e9255: Status 404 returned error can't find the container with id 2e38b60224d62133518083af11f1a7b0ac3c121adcc64df388b08449ba0e9255 Mar 14 05:41:15 crc kubenswrapper[4713]: W0314 05:41:15.840474 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94841b22_b2eb_4519_b04f_98010d848b46.slice/crio-51af602460a2a4afe49f62b5a73b307f9a0faec27ea4501c6d7a6da7d60714ec WatchSource:0}: Error finding container 51af602460a2a4afe49f62b5a73b307f9a0faec27ea4501c6d7a6da7d60714ec: Status 404 returned error can't find the container with id 51af602460a2a4afe49f62b5a73b307f9a0faec27ea4501c6d7a6da7d60714ec Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.354531 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rlcn9"] Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.795119 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" 
event={"ID":"9bce1139-ffd0-4518-8f2c-7ef46b2892e4","Type":"ContainerStarted","Data":"2e38b60224d62133518083af11f1a7b0ac3c121adcc64df388b08449ba0e9255"} Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.796996 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" event={"ID":"428d4860-9850-46cc-82c8-5fcf46c06748","Type":"ContainerStarted","Data":"6416e22b3de42134dc7e8a3089e7229bbf472e335657bf490ee0c4155faf2809"} Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.804051 4713 generic.go:334] "Generic (PLEG): container finished" podID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerID="f8e68c0c14d2b8e0a86acd185e251118f200ec9eb88c4838d73c8e781e854a43" exitCode=0 Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.804137 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlcn9" event={"ID":"2b67dacc-e93d-435c-b930-1731cef0fdaf","Type":"ContainerDied","Data":"f8e68c0c14d2b8e0a86acd185e251118f200ec9eb88c4838d73c8e781e854a43"} Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.804179 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlcn9" event={"ID":"2b67dacc-e93d-435c-b930-1731cef0fdaf","Type":"ContainerStarted","Data":"1c46590f7a12b2954a493122c54bdfd58ec1b6efecaceee1703df1da0f2ecaa8"} Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.809281 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" event={"ID":"94841b22-b2eb-4519-b04f-98010d848b46","Type":"ContainerStarted","Data":"51af602460a2a4afe49f62b5a73b307f9a0faec27ea4501c6d7a6da7d60714ec"} Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.810827 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" 
event={"ID":"f9178880-ef43-43c5-8e91-f4c46d4aa0c6","Type":"ContainerStarted","Data":"c88694328cfe83a2e189d77e442fe62fd3f6386dae048ff8882603418e6668ea"} Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.810969 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.820050 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kdwsl" podStartSLOduration=18.729206291 podStartE2EDuration="25.820028615s" podCreationTimestamp="2026-03-14 05:40:51 +0000 UTC" firstStartedPulling="2026-03-14 05:41:08.877598605 +0000 UTC m=+851.965507905" lastFinishedPulling="2026-03-14 05:41:15.968420929 +0000 UTC m=+859.056330229" observedRunningTime="2026-03-14 05:41:16.816251084 +0000 UTC m=+859.904160404" watchObservedRunningTime="2026-03-14 05:41:16.820028615 +0000 UTC m=+859.907937935" Mar 14 05:41:16 crc kubenswrapper[4713]: I0314 05:41:16.842449 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podStartSLOduration=16.719574484 podStartE2EDuration="24.84242888s" podCreationTimestamp="2026-03-14 05:40:52 +0000 UTC" firstStartedPulling="2026-03-14 05:41:07.82664684 +0000 UTC m=+850.914556140" lastFinishedPulling="2026-03-14 05:41:15.949501236 +0000 UTC m=+859.037410536" observedRunningTime="2026-03-14 05:41:16.837891974 +0000 UTC m=+859.925801284" watchObservedRunningTime="2026-03-14 05:41:16.84242888 +0000 UTC m=+859.930338180" Mar 14 05:41:18 crc kubenswrapper[4713]: I0314 05:41:18.002075 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tb6w5" Mar 14 05:41:18 crc kubenswrapper[4713]: I0314 05:41:18.826119 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" event={"ID":"9bce1139-ffd0-4518-8f2c-7ef46b2892e4","Type":"ContainerStarted","Data":"d590eaa4d87090bdec3c32989259f8b8f06492be6dd09249da796e2bdb8f6018"} Mar 14 05:41:18 crc kubenswrapper[4713]: I0314 05:41:18.830143 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" event={"ID":"8f90d6ef-cc15-4d38-a2c5-f5e778500c73","Type":"ContainerStarted","Data":"07aa73aae1604a870c0e8cd9a2f4c8d291ef5374e2f4ddb27c1ed88c75baa8ca"} Mar 14 05:41:18 crc kubenswrapper[4713]: I0314 05:41:18.844453 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl" podStartSLOduration=25.112820885 podStartE2EDuration="26.844426593s" podCreationTimestamp="2026-03-14 05:40:52 +0000 UTC" firstStartedPulling="2026-03-14 05:41:15.921937686 +0000 UTC m=+859.009846986" lastFinishedPulling="2026-03-14 05:41:17.653543394 +0000 UTC m=+860.741452694" observedRunningTime="2026-03-14 05:41:18.842903134 +0000 UTC m=+861.930812454" watchObservedRunningTime="2026-03-14 05:41:18.844426593 +0000 UTC m=+861.932335903" Mar 14 05:41:19 crc kubenswrapper[4713]: I0314 05:41:19.837896 4713 generic.go:334] "Generic (PLEG): container finished" podID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerID="5a1a56510bc13700bf2eba1196ff7841b708e79b3c53cbd1c8214808ff17c4a2" exitCode=0 Mar 14 05:41:19 crc kubenswrapper[4713]: I0314 05:41:19.837945 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlcn9" event={"ID":"2b67dacc-e93d-435c-b930-1731cef0fdaf","Type":"ContainerDied","Data":"5a1a56510bc13700bf2eba1196ff7841b708e79b3c53cbd1c8214808ff17c4a2"} Mar 14 05:41:19 crc kubenswrapper[4713]: I0314 05:41:19.858065 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d" podStartSLOduration=25.253255055 podStartE2EDuration="27.858046136s" podCreationTimestamp="2026-03-14 05:40:52 +0000 UTC" firstStartedPulling="2026-03-14 05:41:15.047607406 +0000 UTC m=+858.135516706" lastFinishedPulling="2026-03-14 05:41:17.652398497 +0000 UTC m=+860.740307787" observedRunningTime="2026-03-14 05:41:18.884474769 +0000 UTC m=+861.972384089" watchObservedRunningTime="2026-03-14 05:41:19.858046136 +0000 UTC m=+862.945955426" Mar 14 05:41:21 crc kubenswrapper[4713]: I0314 05:41:21.852570 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlcn9" event={"ID":"2b67dacc-e93d-435c-b930-1731cef0fdaf","Type":"ContainerStarted","Data":"0cd870d3eb2ec39160112cb4bcabda3eccab29c91a08f8fbb79a865ca3c81910"} Mar 14 05:41:21 crc kubenswrapper[4713]: I0314 05:41:21.854098 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" event={"ID":"94841b22-b2eb-4519-b04f-98010d848b46","Type":"ContainerStarted","Data":"4ca08f3979fd09d11d1ed07054d0b5cfc936128da8ca599fd63b34b17aa32d07"} Mar 14 05:41:21 crc kubenswrapper[4713]: I0314 05:41:21.854338 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:41:21 crc kubenswrapper[4713]: I0314 05:41:21.887258 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rlcn9" podStartSLOduration=5.790559505 podStartE2EDuration="9.887242337s" podCreationTimestamp="2026-03-14 05:41:12 +0000 UTC" firstStartedPulling="2026-03-14 05:41:17.359496494 +0000 UTC m=+860.447405794" lastFinishedPulling="2026-03-14 05:41:21.456179326 +0000 UTC m=+864.544088626" observedRunningTime="2026-03-14 05:41:21.884665385 +0000 UTC m=+864.972574685" watchObservedRunningTime="2026-03-14 05:41:21.887242337 +0000 UTC 
m=+864.975151637" Mar 14 05:41:21 crc kubenswrapper[4713]: I0314 05:41:21.914935 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" podStartSLOduration=24.741657436 podStartE2EDuration="29.91491331s" podCreationTimestamp="2026-03-14 05:40:52 +0000 UTC" firstStartedPulling="2026-03-14 05:41:15.917163514 +0000 UTC m=+859.005072814" lastFinishedPulling="2026-03-14 05:41:21.090419388 +0000 UTC m=+864.178328688" observedRunningTime="2026-03-14 05:41:21.912769731 +0000 UTC m=+865.000679031" watchObservedRunningTime="2026-03-14 05:41:21.91491331 +0000 UTC m=+865.002822610" Mar 14 05:41:21 crc kubenswrapper[4713]: I0314 05:41:21.973093 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" Mar 14 05:41:22 crc kubenswrapper[4713]: I0314 05:41:22.804735 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 05:41:23 crc kubenswrapper[4713]: I0314 05:41:23.176929 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:23 crc kubenswrapper[4713]: I0314 05:41:23.177127 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:24 crc kubenswrapper[4713]: I0314 05:41:24.872247 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rlcn9" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerName="registry-server" probeResult="failure" output=< Mar 14 05:41:24 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:41:24 crc kubenswrapper[4713]: > Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.856858 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk"] Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.861561 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk" Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.866873 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.867148 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zfmb9" Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.869805 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.910331 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-ljr8h"] Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.912534 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ljr8h" Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.933131 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6pnbs" Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.934382 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk"] Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.953136 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xdfql"] Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.954469 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.956068 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dmmg4" Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.958010 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ljr8h"] Mar 14 05:41:28 crc kubenswrapper[4713]: I0314 05:41:28.962859 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xdfql"] Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.002967 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwmfw\" (UniqueName: \"kubernetes.io/projected/b0927073-aaac-4b3e-93e6-160c866785ad-kube-api-access-vwmfw\") pod \"cert-manager-cainjector-cf98fcc89-hjkpk\" (UID: \"b0927073-aaac-4b3e-93e6-160c866785ad\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.104981 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwmfw\" (UniqueName: \"kubernetes.io/projected/b0927073-aaac-4b3e-93e6-160c866785ad-kube-api-access-vwmfw\") pod \"cert-manager-cainjector-cf98fcc89-hjkpk\" (UID: \"b0927073-aaac-4b3e-93e6-160c866785ad\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.105059 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wp6x\" (UniqueName: \"kubernetes.io/projected/7df19eb2-dee5-4d0e-a141-fd7076e3b2a4-kube-api-access-2wp6x\") pod \"cert-manager-858654f9db-ljr8h\" (UID: \"7df19eb2-dee5-4d0e-a141-fd7076e3b2a4\") " pod="cert-manager/cert-manager-858654f9db-ljr8h" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.105110 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt6wt\" (UniqueName: \"kubernetes.io/projected/6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c-kube-api-access-xt6wt\") pod \"cert-manager-webhook-687f57d79b-xdfql\" (UID: \"6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.152176 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwmfw\" (UniqueName: \"kubernetes.io/projected/b0927073-aaac-4b3e-93e6-160c866785ad-kube-api-access-vwmfw\") pod \"cert-manager-cainjector-cf98fcc89-hjkpk\" (UID: \"b0927073-aaac-4b3e-93e6-160c866785ad\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.191922 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.205654 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wp6x\" (UniqueName: \"kubernetes.io/projected/7df19eb2-dee5-4d0e-a141-fd7076e3b2a4-kube-api-access-2wp6x\") pod \"cert-manager-858654f9db-ljr8h\" (UID: \"7df19eb2-dee5-4d0e-a141-fd7076e3b2a4\") " pod="cert-manager/cert-manager-858654f9db-ljr8h" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.205742 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt6wt\" (UniqueName: \"kubernetes.io/projected/6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c-kube-api-access-xt6wt\") pod \"cert-manager-webhook-687f57d79b-xdfql\" (UID: \"6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.223307 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt6wt\" (UniqueName: 
\"kubernetes.io/projected/6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c-kube-api-access-xt6wt\") pod \"cert-manager-webhook-687f57d79b-xdfql\" (UID: \"6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.223960 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wp6x\" (UniqueName: \"kubernetes.io/projected/7df19eb2-dee5-4d0e-a141-fd7076e3b2a4-kube-api-access-2wp6x\") pod \"cert-manager-858654f9db-ljr8h\" (UID: \"7df19eb2-dee5-4d0e-a141-fd7076e3b2a4\") " pod="cert-manager/cert-manager-858654f9db-ljr8h" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.252366 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ljr8h" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.272000 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.482218 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk"] Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.543369 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ljr8h"] Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.634864 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xdfql"] Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.947514 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk" event={"ID":"b0927073-aaac-4b3e-93e6-160c866785ad","Type":"ContainerStarted","Data":"f13d2eec916eff5d9bc7be494636bd07ebb41be02ae255ce373488627a51579e"} Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.949356 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" event={"ID":"6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c","Type":"ContainerStarted","Data":"fdb5fc94829492ec05b969af0a793193be714059befb661d142c2fe1bf4bb21d"} Mar 14 05:41:29 crc kubenswrapper[4713]: I0314 05:41:29.950471 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ljr8h" event={"ID":"7df19eb2-dee5-4d0e-a141-fd7076e3b2a4","Type":"ContainerStarted","Data":"2d8d376a38edba790bcffe93396fe70f61e0cfbc65d1c6e4f43c13b876703310"} Mar 14 05:41:33 crc kubenswrapper[4713]: I0314 05:41:33.221186 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:33 crc kubenswrapper[4713]: I0314 05:41:33.283776 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:33 crc kubenswrapper[4713]: I0314 05:41:33.454159 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlcn9"] Mar 14 05:41:33 crc kubenswrapper[4713]: I0314 05:41:33.984969 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk" event={"ID":"b0927073-aaac-4b3e-93e6-160c866785ad","Type":"ContainerStarted","Data":"1f49c718fe4b02dfea5226d089a621e8a0fdfcefabeada9bd495308bc1c22c24"} Mar 14 05:41:33 crc kubenswrapper[4713]: I0314 05:41:33.987892 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ljr8h" event={"ID":"7df19eb2-dee5-4d0e-a141-fd7076e3b2a4","Type":"ContainerStarted","Data":"30bef3d7ee4a4ae54a02746346092ca3ea0968a1e83872f314a7e552c1d36130"} Mar 14 05:41:34 crc kubenswrapper[4713]: I0314 05:41:34.009600 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hjkpk" podStartSLOduration=2.5475718560000002 podStartE2EDuration="6.009573172s" 
podCreationTimestamp="2026-03-14 05:41:28 +0000 UTC" firstStartedPulling="2026-03-14 05:41:29.507180368 +0000 UTC m=+872.595089658" lastFinishedPulling="2026-03-14 05:41:32.969181674 +0000 UTC m=+876.057090974" observedRunningTime="2026-03-14 05:41:33.996993251 +0000 UTC m=+877.084902561" watchObservedRunningTime="2026-03-14 05:41:34.009573172 +0000 UTC m=+877.097482482" Mar 14 05:41:34 crc kubenswrapper[4713]: I0314 05:41:34.028463 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-ljr8h" podStartSLOduration=2.617519706 podStartE2EDuration="6.028443704s" podCreationTimestamp="2026-03-14 05:41:28 +0000 UTC" firstStartedPulling="2026-03-14 05:41:29.559152375 +0000 UTC m=+872.647061675" lastFinishedPulling="2026-03-14 05:41:32.970076373 +0000 UTC m=+876.057985673" observedRunningTime="2026-03-14 05:41:34.021476401 +0000 UTC m=+877.109385701" watchObservedRunningTime="2026-03-14 05:41:34.028443704 +0000 UTC m=+877.116353004" Mar 14 05:41:34 crc kubenswrapper[4713]: I0314 05:41:34.995862 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rlcn9" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerName="registry-server" containerID="cri-o://0cd870d3eb2ec39160112cb4bcabda3eccab29c91a08f8fbb79a865ca3c81910" gracePeriod=2 Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.004158 4713 generic.go:334] "Generic (PLEG): container finished" podID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerID="0cd870d3eb2ec39160112cb4bcabda3eccab29c91a08f8fbb79a865ca3c81910" exitCode=0 Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.004248 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlcn9" event={"ID":"2b67dacc-e93d-435c-b930-1731cef0fdaf","Type":"ContainerDied","Data":"0cd870d3eb2ec39160112cb4bcabda3eccab29c91a08f8fbb79a865ca3c81910"} Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 
05:41:36.385258 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.437449 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4shk\" (UniqueName: \"kubernetes.io/projected/2b67dacc-e93d-435c-b930-1731cef0fdaf-kube-api-access-v4shk\") pod \"2b67dacc-e93d-435c-b930-1731cef0fdaf\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.437520 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-catalog-content\") pod \"2b67dacc-e93d-435c-b930-1731cef0fdaf\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.437581 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-utilities\") pod \"2b67dacc-e93d-435c-b930-1731cef0fdaf\" (UID: \"2b67dacc-e93d-435c-b930-1731cef0fdaf\") " Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.439005 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-utilities" (OuterVolumeSpecName: "utilities") pod "2b67dacc-e93d-435c-b930-1731cef0fdaf" (UID: "2b67dacc-e93d-435c-b930-1731cef0fdaf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.444039 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b67dacc-e93d-435c-b930-1731cef0fdaf-kube-api-access-v4shk" (OuterVolumeSpecName: "kube-api-access-v4shk") pod "2b67dacc-e93d-435c-b930-1731cef0fdaf" (UID: "2b67dacc-e93d-435c-b930-1731cef0fdaf"). InnerVolumeSpecName "kube-api-access-v4shk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.539741 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.539785 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4shk\" (UniqueName: \"kubernetes.io/projected/2b67dacc-e93d-435c-b930-1731cef0fdaf-kube-api-access-v4shk\") on node \"crc\" DevicePath \"\"" Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.565235 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b67dacc-e93d-435c-b930-1731cef0fdaf" (UID: "2b67dacc-e93d-435c-b930-1731cef0fdaf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:41:36 crc kubenswrapper[4713]: I0314 05:41:36.640518 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b67dacc-e93d-435c-b930-1731cef0fdaf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.210757 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" event={"ID":"6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c","Type":"ContainerStarted","Data":"3f9945fd9bed656b1e2d8ae66d538c9b1598d8d5ac494c6ab958e36361508e7d"} Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.210967 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.213084 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlcn9" event={"ID":"2b67dacc-e93d-435c-b930-1731cef0fdaf","Type":"ContainerDied","Data":"1c46590f7a12b2954a493122c54bdfd58ec1b6efecaceee1703df1da0f2ecaa8"} Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.213118 4713 scope.go:117] "RemoveContainer" containerID="0cd870d3eb2ec39160112cb4bcabda3eccab29c91a08f8fbb79a865ca3c81910" Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.213551 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rlcn9" Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.230537 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" podStartSLOduration=2.590376321 podStartE2EDuration="9.230518629s" podCreationTimestamp="2026-03-14 05:41:28 +0000 UTC" firstStartedPulling="2026-03-14 05:41:29.643404913 +0000 UTC m=+872.731314213" lastFinishedPulling="2026-03-14 05:41:36.283547221 +0000 UTC m=+879.371456521" observedRunningTime="2026-03-14 05:41:37.227573605 +0000 UTC m=+880.315482905" watchObservedRunningTime="2026-03-14 05:41:37.230518629 +0000 UTC m=+880.318427929" Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.232413 4713 scope.go:117] "RemoveContainer" containerID="5a1a56510bc13700bf2eba1196ff7841b708e79b3c53cbd1c8214808ff17c4a2" Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.247795 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlcn9"] Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.252262 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rlcn9"] Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.264938 4713 scope.go:117] "RemoveContainer" containerID="f8e68c0c14d2b8e0a86acd185e251118f200ec9eb88c4838d73c8e781e854a43" Mar 14 05:41:37 crc kubenswrapper[4713]: I0314 05:41:37.572816 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" path="/var/lib/kubelet/pods/2b67dacc-e93d-435c-b930-1731cef0fdaf/volumes" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.674847 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kxqxq"] Mar 14 05:41:39 crc kubenswrapper[4713]: E0314 05:41:39.676932 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" 
containerName="registry-server" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.676964 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerName="registry-server" Mar 14 05:41:39 crc kubenswrapper[4713]: E0314 05:41:39.677006 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerName="extract-utilities" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.677016 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerName="extract-utilities" Mar 14 05:41:39 crc kubenswrapper[4713]: E0314 05:41:39.677031 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerName="extract-content" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.677040 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerName="extract-content" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.677835 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b67dacc-e93d-435c-b930-1731cef0fdaf" containerName="registry-server" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.680809 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.684377 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxqxq"] Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.785276 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-catalog-content\") pod \"redhat-marketplace-kxqxq\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.785620 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-utilities\") pod \"redhat-marketplace-kxqxq\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.785771 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztz8m\" (UniqueName: \"kubernetes.io/projected/12683b3b-5683-4013-a2e8-c736bf07c669-kube-api-access-ztz8m\") pod \"redhat-marketplace-kxqxq\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.886985 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-utilities\") pod \"redhat-marketplace-kxqxq\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.889405 4713 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ztz8m\" (UniqueName: \"kubernetes.io/projected/12683b3b-5683-4013-a2e8-c736bf07c669-kube-api-access-ztz8m\") pod \"redhat-marketplace-kxqxq\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.889565 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-utilities\") pod \"redhat-marketplace-kxqxq\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.889753 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-catalog-content\") pod \"redhat-marketplace-kxqxq\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.890057 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-catalog-content\") pod \"redhat-marketplace-kxqxq\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:39 crc kubenswrapper[4713]: I0314 05:41:39.910403 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztz8m\" (UniqueName: \"kubernetes.io/projected/12683b3b-5683-4013-a2e8-c736bf07c669-kube-api-access-ztz8m\") pod \"redhat-marketplace-kxqxq\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:40 crc kubenswrapper[4713]: I0314 05:41:40.001148 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:40 crc kubenswrapper[4713]: I0314 05:41:40.456994 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxqxq"] Mar 14 05:41:40 crc kubenswrapper[4713]: I0314 05:41:40.731669 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:41:40 crc kubenswrapper[4713]: I0314 05:41:40.731747 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:41:41 crc kubenswrapper[4713]: I0314 05:41:41.245348 4713 generic.go:334] "Generic (PLEG): container finished" podID="12683b3b-5683-4013-a2e8-c736bf07c669" containerID="7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f" exitCode=0 Mar 14 05:41:41 crc kubenswrapper[4713]: I0314 05:41:41.245400 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxqxq" event={"ID":"12683b3b-5683-4013-a2e8-c736bf07c669","Type":"ContainerDied","Data":"7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f"} Mar 14 05:41:41 crc kubenswrapper[4713]: I0314 05:41:41.245883 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxqxq" event={"ID":"12683b3b-5683-4013-a2e8-c736bf07c669","Type":"ContainerStarted","Data":"f6db250a6bfc4541097c077a360bb45e1543bbaef49abcc9c161742f343f9729"} Mar 14 05:41:43 crc kubenswrapper[4713]: I0314 05:41:43.261246 4713 generic.go:334] "Generic (PLEG): 
container finished" podID="12683b3b-5683-4013-a2e8-c736bf07c669" containerID="2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55" exitCode=0 Mar 14 05:41:43 crc kubenswrapper[4713]: I0314 05:41:43.261336 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxqxq" event={"ID":"12683b3b-5683-4013-a2e8-c736bf07c669","Type":"ContainerDied","Data":"2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55"} Mar 14 05:41:44 crc kubenswrapper[4713]: I0314 05:41:44.275978 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" Mar 14 05:41:44 crc kubenswrapper[4713]: I0314 05:41:44.281090 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxqxq" event={"ID":"12683b3b-5683-4013-a2e8-c736bf07c669","Type":"ContainerStarted","Data":"bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290"} Mar 14 05:41:44 crc kubenswrapper[4713]: I0314 05:41:44.331445 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kxqxq" podStartSLOduration=2.9108325 podStartE2EDuration="5.331419185s" podCreationTimestamp="2026-03-14 05:41:39 +0000 UTC" firstStartedPulling="2026-03-14 05:41:41.248698257 +0000 UTC m=+884.336607597" lastFinishedPulling="2026-03-14 05:41:43.669284962 +0000 UTC m=+886.757194282" observedRunningTime="2026-03-14 05:41:44.326868189 +0000 UTC m=+887.414777489" watchObservedRunningTime="2026-03-14 05:41:44.331419185 +0000 UTC m=+887.419328485" Mar 14 05:41:50 crc kubenswrapper[4713]: I0314 05:41:50.001841 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:50 crc kubenswrapper[4713]: I0314 05:41:50.003313 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kxqxq" 
Mar 14 05:41:50 crc kubenswrapper[4713]: I0314 05:41:50.040983 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:50 crc kubenswrapper[4713]: I0314 05:41:50.361742 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:50 crc kubenswrapper[4713]: I0314 05:41:50.401655 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxqxq"] Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.339010 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kxqxq" podUID="12683b3b-5683-4013-a2e8-c736bf07c669" containerName="registry-server" containerID="cri-o://bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290" gracePeriod=2 Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.780495 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.868442 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-utilities\") pod \"12683b3b-5683-4013-a2e8-c736bf07c669\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.868491 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-catalog-content\") pod \"12683b3b-5683-4013-a2e8-c736bf07c669\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.868550 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztz8m\" (UniqueName: \"kubernetes.io/projected/12683b3b-5683-4013-a2e8-c736bf07c669-kube-api-access-ztz8m\") pod \"12683b3b-5683-4013-a2e8-c736bf07c669\" (UID: \"12683b3b-5683-4013-a2e8-c736bf07c669\") " Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.869425 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-utilities" (OuterVolumeSpecName: "utilities") pod "12683b3b-5683-4013-a2e8-c736bf07c669" (UID: "12683b3b-5683-4013-a2e8-c736bf07c669"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.877825 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12683b3b-5683-4013-a2e8-c736bf07c669-kube-api-access-ztz8m" (OuterVolumeSpecName: "kube-api-access-ztz8m") pod "12683b3b-5683-4013-a2e8-c736bf07c669" (UID: "12683b3b-5683-4013-a2e8-c736bf07c669"). InnerVolumeSpecName "kube-api-access-ztz8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.894889 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12683b3b-5683-4013-a2e8-c736bf07c669" (UID: "12683b3b-5683-4013-a2e8-c736bf07c669"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.969439 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.969476 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12683b3b-5683-4013-a2e8-c736bf07c669-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:41:52 crc kubenswrapper[4713]: I0314 05:41:52.969487 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztz8m\" (UniqueName: \"kubernetes.io/projected/12683b3b-5683-4013-a2e8-c736bf07c669-kube-api-access-ztz8m\") on node \"crc\" DevicePath \"\"" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.348857 4713 generic.go:334] "Generic (PLEG): container finished" podID="12683b3b-5683-4013-a2e8-c736bf07c669" containerID="bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290" exitCode=0 Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.348911 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxqxq" event={"ID":"12683b3b-5683-4013-a2e8-c736bf07c669","Type":"ContainerDied","Data":"bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290"} Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.348975 4713 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-kxqxq" event={"ID":"12683b3b-5683-4013-a2e8-c736bf07c669","Type":"ContainerDied","Data":"f6db250a6bfc4541097c077a360bb45e1543bbaef49abcc9c161742f343f9729"} Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.349008 4713 scope.go:117] "RemoveContainer" containerID="bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.348935 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kxqxq" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.366922 4713 scope.go:117] "RemoveContainer" containerID="2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.386765 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxqxq"] Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.392151 4713 scope.go:117] "RemoveContainer" containerID="7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.392244 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxqxq"] Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.406051 4713 scope.go:117] "RemoveContainer" containerID="bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290" Mar 14 05:41:53 crc kubenswrapper[4713]: E0314 05:41:53.406496 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290\": container with ID starting with bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290 not found: ID does not exist" containerID="bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.406540 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290"} err="failed to get container status \"bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290\": rpc error: code = NotFound desc = could not find container \"bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290\": container with ID starting with bec28110ea945a548d7bc0a76de57011ab6969cb9901ff9e957bf47f2dd71290 not found: ID does not exist" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.406565 4713 scope.go:117] "RemoveContainer" containerID="2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55" Mar 14 05:41:53 crc kubenswrapper[4713]: E0314 05:41:53.406938 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55\": container with ID starting with 2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55 not found: ID does not exist" containerID="2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.406983 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55"} err="failed to get container status \"2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55\": rpc error: code = NotFound desc = could not find container \"2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55\": container with ID starting with 2abf23f584b7698123f91cb97aa9b70e569739c38b8f05326edcc15f44e09e55 not found: ID does not exist" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.407012 4713 scope.go:117] "RemoveContainer" containerID="7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f" Mar 14 05:41:53 crc kubenswrapper[4713]: E0314 
05:41:53.407403 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f\": container with ID starting with 7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f not found: ID does not exist" containerID="7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.407428 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f"} err="failed to get container status \"7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f\": rpc error: code = NotFound desc = could not find container \"7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f\": container with ID starting with 7c8556241d8ad2c8f30a985ac4a9bca7f7d6aabdf7f44c07f87b8377fd9ae38f not found: ID does not exist" Mar 14 05:41:53 crc kubenswrapper[4713]: I0314 05:41:53.576607 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12683b3b-5683-4013-a2e8-c736bf07c669" path="/var/lib/kubelet/pods/12683b3b-5683-4013-a2e8-c736bf07c669/volumes" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.136187 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557782-dpssc"] Mar 14 05:42:00 crc kubenswrapper[4713]: E0314 05:42:00.136995 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12683b3b-5683-4013-a2e8-c736bf07c669" containerName="extract-content" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.137007 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="12683b3b-5683-4013-a2e8-c736bf07c669" containerName="extract-content" Mar 14 05:42:00 crc kubenswrapper[4713]: E0314 05:42:00.137023 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12683b3b-5683-4013-a2e8-c736bf07c669" containerName="registry-server" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.137029 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="12683b3b-5683-4013-a2e8-c736bf07c669" containerName="registry-server" Mar 14 05:42:00 crc kubenswrapper[4713]: E0314 05:42:00.137040 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12683b3b-5683-4013-a2e8-c736bf07c669" containerName="extract-utilities" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.137048 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="12683b3b-5683-4013-a2e8-c736bf07c669" containerName="extract-utilities" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.137153 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="12683b3b-5683-4013-a2e8-c736bf07c669" containerName="registry-server" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.137687 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557782-dpssc" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.140514 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.140530 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.140742 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.142511 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557782-dpssc"] Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.222795 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwdzj\" (UniqueName: 
\"kubernetes.io/projected/32a52a89-d171-46d2-9a6d-6263fb859454-kube-api-access-pwdzj\") pod \"auto-csr-approver-29557782-dpssc\" (UID: \"32a52a89-d171-46d2-9a6d-6263fb859454\") " pod="openshift-infra/auto-csr-approver-29557782-dpssc" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.323997 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwdzj\" (UniqueName: \"kubernetes.io/projected/32a52a89-d171-46d2-9a6d-6263fb859454-kube-api-access-pwdzj\") pod \"auto-csr-approver-29557782-dpssc\" (UID: \"32a52a89-d171-46d2-9a6d-6263fb859454\") " pod="openshift-infra/auto-csr-approver-29557782-dpssc" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.346082 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwdzj\" (UniqueName: \"kubernetes.io/projected/32a52a89-d171-46d2-9a6d-6263fb859454-kube-api-access-pwdzj\") pod \"auto-csr-approver-29557782-dpssc\" (UID: \"32a52a89-d171-46d2-9a6d-6263fb859454\") " pod="openshift-infra/auto-csr-approver-29557782-dpssc" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.470081 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557782-dpssc" Mar 14 05:42:00 crc kubenswrapper[4713]: I0314 05:42:00.855487 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557782-dpssc"] Mar 14 05:42:00 crc kubenswrapper[4713]: W0314 05:42:00.861422 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a52a89_d171_46d2_9a6d_6263fb859454.slice/crio-0f9afb9c2b323d33ff4b97054a4eb08602bfab4674a68c9bdfcc68353ceb3adc WatchSource:0}: Error finding container 0f9afb9c2b323d33ff4b97054a4eb08602bfab4674a68c9bdfcc68353ceb3adc: Status 404 returned error can't find the container with id 0f9afb9c2b323d33ff4b97054a4eb08602bfab4674a68c9bdfcc68353ceb3adc Mar 14 05:42:01 crc kubenswrapper[4713]: I0314 05:42:01.401705 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557782-dpssc" event={"ID":"32a52a89-d171-46d2-9a6d-6263fb859454","Type":"ContainerStarted","Data":"0f9afb9c2b323d33ff4b97054a4eb08602bfab4674a68c9bdfcc68353ceb3adc"} Mar 14 05:42:02 crc kubenswrapper[4713]: I0314 05:42:02.410090 4713 generic.go:334] "Generic (PLEG): container finished" podID="32a52a89-d171-46d2-9a6d-6263fb859454" containerID="7df476fc18b38503e61dd067ffe79ff0d7f2f7df934b3a84918ed470e578f06e" exitCode=0 Mar 14 05:42:02 crc kubenswrapper[4713]: I0314 05:42:02.410133 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557782-dpssc" event={"ID":"32a52a89-d171-46d2-9a6d-6263fb859454","Type":"ContainerDied","Data":"7df476fc18b38503e61dd067ffe79ff0d7f2f7df934b3a84918ed470e578f06e"} Mar 14 05:42:03 crc kubenswrapper[4713]: I0314 05:42:03.679714 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557782-dpssc" Mar 14 05:42:03 crc kubenswrapper[4713]: I0314 05:42:03.877859 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwdzj\" (UniqueName: \"kubernetes.io/projected/32a52a89-d171-46d2-9a6d-6263fb859454-kube-api-access-pwdzj\") pod \"32a52a89-d171-46d2-9a6d-6263fb859454\" (UID: \"32a52a89-d171-46d2-9a6d-6263fb859454\") " Mar 14 05:42:03 crc kubenswrapper[4713]: I0314 05:42:03.883683 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a52a89-d171-46d2-9a6d-6263fb859454-kube-api-access-pwdzj" (OuterVolumeSpecName: "kube-api-access-pwdzj") pod "32a52a89-d171-46d2-9a6d-6263fb859454" (UID: "32a52a89-d171-46d2-9a6d-6263fb859454"). InnerVolumeSpecName "kube-api-access-pwdzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:42:03 crc kubenswrapper[4713]: I0314 05:42:03.979425 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwdzj\" (UniqueName: \"kubernetes.io/projected/32a52a89-d171-46d2-9a6d-6263fb859454-kube-api-access-pwdzj\") on node \"crc\" DevicePath \"\"" Mar 14 05:42:04 crc kubenswrapper[4713]: I0314 05:42:04.425300 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557782-dpssc" event={"ID":"32a52a89-d171-46d2-9a6d-6263fb859454","Type":"ContainerDied","Data":"0f9afb9c2b323d33ff4b97054a4eb08602bfab4674a68c9bdfcc68353ceb3adc"} Mar 14 05:42:04 crc kubenswrapper[4713]: I0314 05:42:04.425354 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557782-dpssc" Mar 14 05:42:04 crc kubenswrapper[4713]: I0314 05:42:04.425360 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f9afb9c2b323d33ff4b97054a4eb08602bfab4674a68c9bdfcc68353ceb3adc" Mar 14 05:42:04 crc kubenswrapper[4713]: I0314 05:42:04.736404 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557776-fd952"] Mar 14 05:42:04 crc kubenswrapper[4713]: I0314 05:42:04.743072 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557776-fd952"] Mar 14 05:42:05 crc kubenswrapper[4713]: I0314 05:42:05.583025 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e5d46f-370a-410c-a86e-e0ae032a3b35" path="/var/lib/kubelet/pods/a5e5d46f-370a-410c-a86e-e0ae032a3b35/volumes" Mar 14 05:42:10 crc kubenswrapper[4713]: I0314 05:42:10.453491 4713 scope.go:117] "RemoveContainer" containerID="b4cca9795aafe53ce4a49800ec57f6107a322b3a1b2af1bdab1be40845a65cc2" Mar 14 05:42:10 crc kubenswrapper[4713]: I0314 05:42:10.731450 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:42:10 crc kubenswrapper[4713]: I0314 05:42:10.731523 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:42:10 crc kubenswrapper[4713]: I0314 05:42:10.731565 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:42:10 crc kubenswrapper[4713]: I0314 05:42:10.732190 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57dee1484521e9bfb2409914893e35a03113f50890dbf98510a8c171581cf4ea"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 05:42:10 crc kubenswrapper[4713]: I0314 05:42:10.732572 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://57dee1484521e9bfb2409914893e35a03113f50890dbf98510a8c171581cf4ea" gracePeriod=600 Mar 14 05:42:11 crc kubenswrapper[4713]: I0314 05:42:11.473325 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="57dee1484521e9bfb2409914893e35a03113f50890dbf98510a8c171581cf4ea" exitCode=0 Mar 14 05:42:11 crc kubenswrapper[4713]: I0314 05:42:11.473524 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"57dee1484521e9bfb2409914893e35a03113f50890dbf98510a8c171581cf4ea"} Mar 14 05:42:11 crc kubenswrapper[4713]: I0314 05:42:11.473823 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"b10f030c50b79b6c8fa097a372898693b75a3027ed8338227f2cd1cda4fb2db1"} Mar 14 05:42:11 crc kubenswrapper[4713]: I0314 05:42:11.473842 4713 scope.go:117] "RemoveContainer" 
containerID="b547c1a6b15d35c71b1fe36925c299d8cf39995de4a55026e9398295e2918673" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.113642 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww"] Mar 14 05:42:12 crc kubenswrapper[4713]: E0314 05:42:12.114220 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a52a89-d171-46d2-9a6d-6263fb859454" containerName="oc" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.114322 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a52a89-d171-46d2-9a6d-6263fb859454" containerName="oc" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.114577 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a52a89-d171-46d2-9a6d-6263fb859454" containerName="oc" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.115739 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.119870 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.123921 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww"] Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.205594 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 
05:42:12.205671 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx9f6\" (UniqueName: \"kubernetes.io/projected/aff57dd5-95fe-4005-bdaf-20d8516df1d4-kube-api-access-hx9f6\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.205702 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.307062 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.307161 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx9f6\" (UniqueName: \"kubernetes.io/projected/aff57dd5-95fe-4005-bdaf-20d8516df1d4-kube-api-access-hx9f6\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.307195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.307708 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.307764 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.329443 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx9f6\" (UniqueName: \"kubernetes.io/projected/aff57dd5-95fe-4005-bdaf-20d8516df1d4-kube-api-access-hx9f6\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.432160 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:12 crc kubenswrapper[4713]: I0314 05:42:12.676475 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww"] Mar 14 05:42:13 crc kubenswrapper[4713]: I0314 05:42:13.490532 4713 generic.go:334] "Generic (PLEG): container finished" podID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerID="eacd2f5865cfdaa0a956e4d52f92ebe86a63f3f3183fa1777eb49afd5e6a74ad" exitCode=0 Mar 14 05:42:13 crc kubenswrapper[4713]: I0314 05:42:13.490660 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" event={"ID":"aff57dd5-95fe-4005-bdaf-20d8516df1d4","Type":"ContainerDied","Data":"eacd2f5865cfdaa0a956e4d52f92ebe86a63f3f3183fa1777eb49afd5e6a74ad"} Mar 14 05:42:13 crc kubenswrapper[4713]: I0314 05:42:13.491047 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" event={"ID":"aff57dd5-95fe-4005-bdaf-20d8516df1d4","Type":"ContainerStarted","Data":"a1dcbbff6f051486ceda52ced80fee3cfa7a8ff6b8c59c95b89b3d2fe6cc3b6f"} Mar 14 05:42:15 crc kubenswrapper[4713]: I0314 05:42:15.504715 4713 generic.go:334] "Generic (PLEG): container finished" podID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerID="a75bc4e42e29ce9ffb84c8e64f201fe1ae70c382dec87ccfeb45442d22644179" exitCode=0 Mar 14 05:42:15 crc kubenswrapper[4713]: I0314 05:42:15.504775 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" event={"ID":"aff57dd5-95fe-4005-bdaf-20d8516df1d4","Type":"ContainerDied","Data":"a75bc4e42e29ce9ffb84c8e64f201fe1ae70c382dec87ccfeb45442d22644179"} Mar 14 05:42:16 crc kubenswrapper[4713]: I0314 05:42:16.513184 4713 
generic.go:334] "Generic (PLEG): container finished" podID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerID="3a9dd2dc9d8e8f15e81fff3bb1e324728ecd412da4cd1f0852ac8ca82e0b9ad2" exitCode=0 Mar 14 05:42:16 crc kubenswrapper[4713]: I0314 05:42:16.513243 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" event={"ID":"aff57dd5-95fe-4005-bdaf-20d8516df1d4","Type":"ContainerDied","Data":"3a9dd2dc9d8e8f15e81fff3bb1e324728ecd412da4cd1f0852ac8ca82e0b9ad2"} Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.758353 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.872644 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-util\") pod \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.872726 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-bundle\") pod \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.872805 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx9f6\" (UniqueName: \"kubernetes.io/projected/aff57dd5-95fe-4005-bdaf-20d8516df1d4-kube-api-access-hx9f6\") pod \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\" (UID: \"aff57dd5-95fe-4005-bdaf-20d8516df1d4\") " Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.873885 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-bundle" (OuterVolumeSpecName: "bundle") pod "aff57dd5-95fe-4005-bdaf-20d8516df1d4" (UID: "aff57dd5-95fe-4005-bdaf-20d8516df1d4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.878507 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aff57dd5-95fe-4005-bdaf-20d8516df1d4-kube-api-access-hx9f6" (OuterVolumeSpecName: "kube-api-access-hx9f6") pod "aff57dd5-95fe-4005-bdaf-20d8516df1d4" (UID: "aff57dd5-95fe-4005-bdaf-20d8516df1d4"). InnerVolumeSpecName "kube-api-access-hx9f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.889446 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-util" (OuterVolumeSpecName: "util") pod "aff57dd5-95fe-4005-bdaf-20d8516df1d4" (UID: "aff57dd5-95fe-4005-bdaf-20d8516df1d4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.974012 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-util\") on node \"crc\" DevicePath \"\"" Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.974284 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aff57dd5-95fe-4005-bdaf-20d8516df1d4-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:42:17 crc kubenswrapper[4713]: I0314 05:42:17.974384 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx9f6\" (UniqueName: \"kubernetes.io/projected/aff57dd5-95fe-4005-bdaf-20d8516df1d4-kube-api-access-hx9f6\") on node \"crc\" DevicePath \"\"" Mar 14 05:42:18 crc kubenswrapper[4713]: I0314 05:42:18.528359 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" event={"ID":"aff57dd5-95fe-4005-bdaf-20d8516df1d4","Type":"ContainerDied","Data":"a1dcbbff6f051486ceda52ced80fee3cfa7a8ff6b8c59c95b89b3d2fe6cc3b6f"} Mar 14 05:42:18 crc kubenswrapper[4713]: I0314 05:42:18.528685 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1dcbbff6f051486ceda52ced80fee3cfa7a8ff6b8c59c95b89b3d2fe6cc3b6f" Mar 14 05:42:18 crc kubenswrapper[4713]: I0314 05:42:18.528398 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww" Mar 14 05:42:19 crc kubenswrapper[4713]: I0314 05:42:19.907182 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4"] Mar 14 05:42:19 crc kubenswrapper[4713]: E0314 05:42:19.907861 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerName="util" Mar 14 05:42:19 crc kubenswrapper[4713]: I0314 05:42:19.907877 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerName="util" Mar 14 05:42:19 crc kubenswrapper[4713]: E0314 05:42:19.907895 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerName="extract" Mar 14 05:42:19 crc kubenswrapper[4713]: I0314 05:42:19.907903 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerName="extract" Mar 14 05:42:19 crc kubenswrapper[4713]: E0314 05:42:19.907914 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerName="pull" Mar 14 05:42:19 crc kubenswrapper[4713]: I0314 05:42:19.907922 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerName="pull" Mar 14 05:42:19 crc kubenswrapper[4713]: I0314 05:42:19.908078 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff57dd5-95fe-4005-bdaf-20d8516df1d4" containerName="extract" Mar 14 05:42:19 crc kubenswrapper[4713]: I0314 05:42:19.909163 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:19 crc kubenswrapper[4713]: I0314 05:42:19.919372 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 05:42:19 crc kubenswrapper[4713]: I0314 05:42:19.929645 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4"] Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.005635 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcr7b\" (UniqueName: \"kubernetes.io/projected/6b454723-8a49-4be7-87fd-ca93753c8c91-kube-api-access-fcr7b\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.005716 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.005768 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: 
I0314 05:42:20.106677 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcr7b\" (UniqueName: \"kubernetes.io/projected/6b454723-8a49-4be7-87fd-ca93753c8c91-kube-api-access-fcr7b\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.106779 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.106815 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.107294 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.107509 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.130418 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcr7b\" (UniqueName: \"kubernetes.io/projected/6b454723-8a49-4be7-87fd-ca93753c8c91-kube-api-access-fcr7b\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.235280 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.464389 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4"] Mar 14 05:42:20 crc kubenswrapper[4713]: I0314 05:42:20.543917 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" event={"ID":"6b454723-8a49-4be7-87fd-ca93753c8c91","Type":"ContainerStarted","Data":"fd240efe029a5dfb6e4feb3e0b5b66b6278aa02bd6e33b82325a144880da41a3"} Mar 14 05:42:21 crc kubenswrapper[4713]: I0314 05:42:21.551841 4713 generic.go:334] "Generic (PLEG): container finished" podID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerID="7bc72fb2ea4e466ac22c58b6ccf9254e3a13036c6c8229772477eef7d1406ce9" exitCode=0 Mar 14 05:42:21 crc kubenswrapper[4713]: I0314 05:42:21.551902 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" event={"ID":"6b454723-8a49-4be7-87fd-ca93753c8c91","Type":"ContainerDied","Data":"7bc72fb2ea4e466ac22c58b6ccf9254e3a13036c6c8229772477eef7d1406ce9"} Mar 14 05:42:23 crc kubenswrapper[4713]: I0314 05:42:23.565385 4713 generic.go:334] "Generic (PLEG): container finished" podID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerID="dbcf1dced699322c0d82ddcaa681d2a7f08e834b15ed7261a75ba03663c2a04d" exitCode=0 Mar 14 05:42:23 crc kubenswrapper[4713]: I0314 05:42:23.575007 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" event={"ID":"6b454723-8a49-4be7-87fd-ca93753c8c91","Type":"ContainerDied","Data":"dbcf1dced699322c0d82ddcaa681d2a7f08e834b15ed7261a75ba03663c2a04d"} Mar 14 05:42:24 crc kubenswrapper[4713]: I0314 05:42:24.573926 4713 generic.go:334] "Generic (PLEG): container finished" podID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerID="5b93547f08355408f3e2fd69a395b07b95525a1f2f80c805813f16b4c291167b" exitCode=0 Mar 14 05:42:24 crc kubenswrapper[4713]: I0314 05:42:24.574001 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" event={"ID":"6b454723-8a49-4be7-87fd-ca93753c8c91","Type":"ContainerDied","Data":"5b93547f08355408f3e2fd69a395b07b95525a1f2f80c805813f16b4c291167b"} Mar 14 05:42:25 crc kubenswrapper[4713]: I0314 05:42:25.849677 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:25 crc kubenswrapper[4713]: I0314 05:42:25.987242 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-bundle\") pod \"6b454723-8a49-4be7-87fd-ca93753c8c91\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " Mar 14 05:42:25 crc kubenswrapper[4713]: I0314 05:42:25.987344 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcr7b\" (UniqueName: \"kubernetes.io/projected/6b454723-8a49-4be7-87fd-ca93753c8c91-kube-api-access-fcr7b\") pod \"6b454723-8a49-4be7-87fd-ca93753c8c91\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " Mar 14 05:42:25 crc kubenswrapper[4713]: I0314 05:42:25.987446 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-util\") pod \"6b454723-8a49-4be7-87fd-ca93753c8c91\" (UID: \"6b454723-8a49-4be7-87fd-ca93753c8c91\") " Mar 14 05:42:25 crc kubenswrapper[4713]: I0314 05:42:25.988572 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-bundle" (OuterVolumeSpecName: "bundle") pod "6b454723-8a49-4be7-87fd-ca93753c8c91" (UID: "6b454723-8a49-4be7-87fd-ca93753c8c91"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:42:25 crc kubenswrapper[4713]: I0314 05:42:25.997346 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b454723-8a49-4be7-87fd-ca93753c8c91-kube-api-access-fcr7b" (OuterVolumeSpecName: "kube-api-access-fcr7b") pod "6b454723-8a49-4be7-87fd-ca93753c8c91" (UID: "6b454723-8a49-4be7-87fd-ca93753c8c91"). InnerVolumeSpecName "kube-api-access-fcr7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.088880 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.088921 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcr7b\" (UniqueName: \"kubernetes.io/projected/6b454723-8a49-4be7-87fd-ca93753c8c91-kube-api-access-fcr7b\") on node \"crc\" DevicePath \"\"" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.101836 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p"] Mar 14 05:42:26 crc kubenswrapper[4713]: E0314 05:42:26.102108 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerName="extract" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.102119 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerName="extract" Mar 14 05:42:26 crc kubenswrapper[4713]: E0314 05:42:26.102136 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerName="util" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.102143 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerName="util" Mar 14 05:42:26 crc kubenswrapper[4713]: E0314 05:42:26.102151 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerName="pull" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.102157 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerName="pull" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.102289 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6b454723-8a49-4be7-87fd-ca93753c8c91" containerName="extract" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.102915 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.105558 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.106661 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.107422 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-s92mq" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.107798 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.107870 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.108607 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.128327 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p"] Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.190014 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aec4bfff-0bef-401b-9db6-f9046825614a-apiservice-cert\") pod 
\"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.190087 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqfwt\" (UniqueName: \"kubernetes.io/projected/aec4bfff-0bef-401b-9db6-f9046825614a-kube-api-access-qqfwt\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.190120 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aec4bfff-0bef-401b-9db6-f9046825614a-webhook-cert\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.190144 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aec4bfff-0bef-401b-9db6-f9046825614a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.190174 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/aec4bfff-0bef-401b-9db6-f9046825614a-manager-config\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.221572 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-util" (OuterVolumeSpecName: "util") pod "6b454723-8a49-4be7-87fd-ca93753c8c91" (UID: "6b454723-8a49-4be7-87fd-ca93753c8c91"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.291831 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/aec4bfff-0bef-401b-9db6-f9046825614a-manager-config\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.291921 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aec4bfff-0bef-401b-9db6-f9046825614a-apiservice-cert\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.291963 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqfwt\" (UniqueName: \"kubernetes.io/projected/aec4bfff-0bef-401b-9db6-f9046825614a-kube-api-access-qqfwt\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.291993 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aec4bfff-0bef-401b-9db6-f9046825614a-webhook-cert\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.292013 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aec4bfff-0bef-401b-9db6-f9046825614a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.292063 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b454723-8a49-4be7-87fd-ca93753c8c91-util\") on node \"crc\" DevicePath \"\"" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.292848 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/aec4bfff-0bef-401b-9db6-f9046825614a-manager-config\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.296400 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aec4bfff-0bef-401b-9db6-f9046825614a-webhook-cert\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.297399 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aec4bfff-0bef-401b-9db6-f9046825614a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.300805 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aec4bfff-0bef-401b-9db6-f9046825614a-apiservice-cert\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.314279 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqfwt\" (UniqueName: \"kubernetes.io/projected/aec4bfff-0bef-401b-9db6-f9046825614a-kube-api-access-qqfwt\") pod \"loki-operator-controller-manager-66945dfc9f-xqf5p\" (UID: \"aec4bfff-0bef-401b-9db6-f9046825614a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.418504 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.591074 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" event={"ID":"6b454723-8a49-4be7-87fd-ca93753c8c91","Type":"ContainerDied","Data":"fd240efe029a5dfb6e4feb3e0b5b66b6278aa02bd6e33b82325a144880da41a3"} Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.591127 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd240efe029a5dfb6e4feb3e0b5b66b6278aa02bd6e33b82325a144880da41a3" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.591195 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4" Mar 14 05:42:26 crc kubenswrapper[4713]: I0314 05:42:26.670482 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p"] Mar 14 05:42:26 crc kubenswrapper[4713]: W0314 05:42:26.677471 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaec4bfff_0bef_401b_9db6_f9046825614a.slice/crio-e20044d1ea44d79af43a56dfad47967e685d39c31e6d3200bfc72f23d35f18a6 WatchSource:0}: Error finding container e20044d1ea44d79af43a56dfad47967e685d39c31e6d3200bfc72f23d35f18a6: Status 404 returned error can't find the container with id e20044d1ea44d79af43a56dfad47967e685d39c31e6d3200bfc72f23d35f18a6 Mar 14 05:42:27 crc kubenswrapper[4713]: I0314 05:42:27.601320 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" 
event={"ID":"aec4bfff-0bef-401b-9db6-f9046825614a","Type":"ContainerStarted","Data":"e20044d1ea44d79af43a56dfad47967e685d39c31e6d3200bfc72f23d35f18a6"} Mar 14 05:42:32 crc kubenswrapper[4713]: I0314 05:42:32.639905 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" event={"ID":"aec4bfff-0bef-401b-9db6-f9046825614a","Type":"ContainerStarted","Data":"113cf48a36559b3eeb4adaa85c82eeaf6ad7105fcea8f89abdcd37bf151d22e6"} Mar 14 05:42:37 crc kubenswrapper[4713]: I0314 05:42:37.905494 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7"] Mar 14 05:42:37 crc kubenswrapper[4713]: I0314 05:42:37.907127 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7" Mar 14 05:42:37 crc kubenswrapper[4713]: I0314 05:42:37.909226 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 14 05:42:37 crc kubenswrapper[4713]: I0314 05:42:37.909353 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 14 05:42:37 crc kubenswrapper[4713]: I0314 05:42:37.911077 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-dx5wt" Mar 14 05:42:37 crc kubenswrapper[4713]: I0314 05:42:37.926735 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7"] Mar 14 05:42:38 crc kubenswrapper[4713]: I0314 05:42:38.003962 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tkqs\" (UniqueName: \"kubernetes.io/projected/6b3214ee-eee2-4b89-8491-2255fd1068be-kube-api-access-5tkqs\") pod \"cluster-logging-operator-66689c4bbf-f7fr7\" (UID: 
\"6b3214ee-eee2-4b89-8491-2255fd1068be\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7" Mar 14 05:42:38 crc kubenswrapper[4713]: I0314 05:42:38.105647 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tkqs\" (UniqueName: \"kubernetes.io/projected/6b3214ee-eee2-4b89-8491-2255fd1068be-kube-api-access-5tkqs\") pod \"cluster-logging-operator-66689c4bbf-f7fr7\" (UID: \"6b3214ee-eee2-4b89-8491-2255fd1068be\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7" Mar 14 05:42:38 crc kubenswrapper[4713]: I0314 05:42:38.129920 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tkqs\" (UniqueName: \"kubernetes.io/projected/6b3214ee-eee2-4b89-8491-2255fd1068be-kube-api-access-5tkqs\") pod \"cluster-logging-operator-66689c4bbf-f7fr7\" (UID: \"6b3214ee-eee2-4b89-8491-2255fd1068be\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7" Mar 14 05:42:38 crc kubenswrapper[4713]: I0314 05:42:38.224556 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7" Mar 14 05:42:39 crc kubenswrapper[4713]: I0314 05:42:39.512372 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7"] Mar 14 05:42:39 crc kubenswrapper[4713]: W0314 05:42:39.531366 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b3214ee_eee2_4b89_8491_2255fd1068be.slice/crio-e1d6d5ba53910da91ec4aa55674c75b5e0ef8ea528311b1ab1ad02eee8fef248 WatchSource:0}: Error finding container e1d6d5ba53910da91ec4aa55674c75b5e0ef8ea528311b1ab1ad02eee8fef248: Status 404 returned error can't find the container with id e1d6d5ba53910da91ec4aa55674c75b5e0ef8ea528311b1ab1ad02eee8fef248 Mar 14 05:42:39 crc kubenswrapper[4713]: I0314 05:42:39.704404 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7" event={"ID":"6b3214ee-eee2-4b89-8491-2255fd1068be","Type":"ContainerStarted","Data":"e1d6d5ba53910da91ec4aa55674c75b5e0ef8ea528311b1ab1ad02eee8fef248"} Mar 14 05:42:39 crc kubenswrapper[4713]: I0314 05:42:39.706333 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" event={"ID":"aec4bfff-0bef-401b-9db6-f9046825614a","Type":"ContainerStarted","Data":"fb39a6454cedbde98bca052e7039f286c274d38b39918fbcaddd9bd201858c16"} Mar 14 05:42:39 crc kubenswrapper[4713]: I0314 05:42:39.706735 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:39 crc kubenswrapper[4713]: I0314 05:42:39.708284 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" Mar 14 05:42:39 crc kubenswrapper[4713]: I0314 
05:42:39.737299 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" podStartSLOduration=1.134796801 podStartE2EDuration="13.737279024s" podCreationTimestamp="2026-03-14 05:42:26 +0000 UTC" firstStartedPulling="2026-03-14 05:42:26.680358818 +0000 UTC m=+929.768268118" lastFinishedPulling="2026-03-14 05:42:39.282841041 +0000 UTC m=+942.370750341" observedRunningTime="2026-03-14 05:42:39.731340627 +0000 UTC m=+942.819249937" watchObservedRunningTime="2026-03-14 05:42:39.737279024 +0000 UTC m=+942.825188324" Mar 14 05:42:45 crc kubenswrapper[4713]: I0314 05:42:45.747721 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7" event={"ID":"6b3214ee-eee2-4b89-8491-2255fd1068be","Type":"ContainerStarted","Data":"342739d494a4a2e78153fe970167b611d4fb7c47d54434784d57c6444a8d727a"} Mar 14 05:42:45 crc kubenswrapper[4713]: I0314 05:42:45.765363 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-66689c4bbf-f7fr7" podStartSLOduration=3.415749806 podStartE2EDuration="8.765346097s" podCreationTimestamp="2026-03-14 05:42:37 +0000 UTC" firstStartedPulling="2026-03-14 05:42:39.530946028 +0000 UTC m=+942.618855328" lastFinishedPulling="2026-03-14 05:42:44.880542319 +0000 UTC m=+947.968451619" observedRunningTime="2026-03-14 05:42:45.763794298 +0000 UTC m=+948.851703598" watchObservedRunningTime="2026-03-14 05:42:45.765346097 +0000 UTC m=+948.853255407" Mar 14 05:42:51 crc kubenswrapper[4713]: I0314 05:42:51.496465 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 14 05:42:51 crc kubenswrapper[4713]: I0314 05:42:51.501019 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 14 05:42:51 crc kubenswrapper[4713]: I0314 05:42:51.506966 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 14 05:42:51 crc kubenswrapper[4713]: I0314 05:42:51.515743 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 14 05:42:51 crc kubenswrapper[4713]: I0314 05:42:51.515995 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 14 05:42:52 crc kubenswrapper[4713]: I0314 05:42:52.271450 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbp4m\" (UniqueName: \"kubernetes.io/projected/a8e07b10-6302-4eed-9559-f6945b344f37-kube-api-access-pbp4m\") pod \"minio\" (UID: \"a8e07b10-6302-4eed-9559-f6945b344f37\") " pod="minio-dev/minio" Mar 14 05:42:52 crc kubenswrapper[4713]: I0314 05:42:52.272012 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-236a6118-ad49-4516-9ea9-d56a716943ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-236a6118-ad49-4516-9ea9-d56a716943ac\") pod \"minio\" (UID: \"a8e07b10-6302-4eed-9559-f6945b344f37\") " pod="minio-dev/minio" Mar 14 05:42:52 crc kubenswrapper[4713]: I0314 05:42:52.372930 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbp4m\" (UniqueName: \"kubernetes.io/projected/a8e07b10-6302-4eed-9559-f6945b344f37-kube-api-access-pbp4m\") pod \"minio\" (UID: \"a8e07b10-6302-4eed-9559-f6945b344f37\") " pod="minio-dev/minio" Mar 14 05:42:53 crc kubenswrapper[4713]: I0314 05:42:53.805873 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-236a6118-ad49-4516-9ea9-d56a716943ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-236a6118-ad49-4516-9ea9-d56a716943ac\") pod \"minio\" (UID: 
\"a8e07b10-6302-4eed-9559-f6945b344f37\") " pod="minio-dev/minio" Mar 14 05:42:53 crc kubenswrapper[4713]: I0314 05:42:53.812923 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbp4m\" (UniqueName: \"kubernetes.io/projected/a8e07b10-6302-4eed-9559-f6945b344f37-kube-api-access-pbp4m\") pod \"minio\" (UID: \"a8e07b10-6302-4eed-9559-f6945b344f37\") " pod="minio-dev/minio" Mar 14 05:42:53 crc kubenswrapper[4713]: I0314 05:42:53.824835 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:42:53 crc kubenswrapper[4713]: I0314 05:42:53.825135 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-236a6118-ad49-4516-9ea9-d56a716943ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-236a6118-ad49-4516-9ea9-d56a716943ac\") pod \"minio\" (UID: \"a8e07b10-6302-4eed-9559-f6945b344f37\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f81e169963ad84b2662f4ee72fa2d7b7783e2ffd69318c8dbf99b82028622c11/globalmount\"" pod="minio-dev/minio" Mar 14 05:42:53 crc kubenswrapper[4713]: I0314 05:42:53.964075 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-236a6118-ad49-4516-9ea9-d56a716943ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-236a6118-ad49-4516-9ea9-d56a716943ac\") pod \"minio\" (UID: \"a8e07b10-6302-4eed-9559-f6945b344f37\") " pod="minio-dev/minio" Mar 14 05:42:54 crc kubenswrapper[4713]: I0314 05:42:54.976329 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 14 05:42:58 crc kubenswrapper[4713]: I0314 05:42:58.822097 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 14 05:42:59 crc kubenswrapper[4713]: I0314 05:42:59.342822 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"a8e07b10-6302-4eed-9559-f6945b344f37","Type":"ContainerStarted","Data":"efa725cafa325ae069a76ca1aafe405d050644bd19d75833df78bb1bb6f21e1b"} Mar 14 05:43:03 crc kubenswrapper[4713]: I0314 05:43:03.411185 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"a8e07b10-6302-4eed-9559-f6945b344f37","Type":"ContainerStarted","Data":"8b20bd4415e66e221145a5a4c1e99c9e109d90d5c922cb6f32151b5ae71b7d0c"} Mar 14 05:43:03 crc kubenswrapper[4713]: I0314 05:43:03.434272 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=11.406605832 podStartE2EDuration="15.434184782s" podCreationTimestamp="2026-03-14 05:42:48 +0000 UTC" firstStartedPulling="2026-03-14 05:42:58.844318385 +0000 UTC m=+961.932227685" lastFinishedPulling="2026-03-14 05:43:02.871897335 +0000 UTC m=+965.959806635" observedRunningTime="2026-03-14 05:43:03.430481156 +0000 UTC m=+966.518390456" watchObservedRunningTime="2026-03-14 05:43:03.434184782 +0000 UTC m=+966.522094082" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.256254 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6"] Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.257622 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.265570 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-9tsfg" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.266995 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.279154 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.279377 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.279263 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.284869 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6"] Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.338424 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgd79\" (UniqueName: \"kubernetes.io/projected/82a7870b-ac91-41f9-a94f-41db191e711b-kube-api-access-qgd79\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.338474 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/82a7870b-ac91-41f9-a94f-41db191e711b-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: 
\"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.338510 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a7870b-ac91-41f9-a94f-41db191e711b-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.338567 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a7870b-ac91-41f9-a94f-41db191e711b-config\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.338710 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/82a7870b-ac91-41f9-a94f-41db191e711b-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.439614 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a7870b-ac91-41f9-a94f-41db191e711b-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.439682 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a7870b-ac91-41f9-a94f-41db191e711b-config\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.439709 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/82a7870b-ac91-41f9-a94f-41db191e711b-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.439765 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgd79\" (UniqueName: \"kubernetes.io/projected/82a7870b-ac91-41f9-a94f-41db191e711b-kube-api-access-qgd79\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.439789 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/82a7870b-ac91-41f9-a94f-41db191e711b-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.440850 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a7870b-ac91-41f9-a94f-41db191e711b-config\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " 
pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.442651 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a7870b-ac91-41f9-a94f-41db191e711b-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.453512 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/82a7870b-ac91-41f9-a94f-41db191e711b-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.455845 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/82a7870b-ac91-41f9-a94f-41db191e711b-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.476165 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgd79\" (UniqueName: \"kubernetes.io/projected/82a7870b-ac91-41f9-a94f-41db191e711b-kube-api-access-qgd79\") pod \"logging-loki-distributor-9c6b6d984-h7xw6\" (UID: \"82a7870b-ac91-41f9-a94f-41db191e711b\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:10 crc kubenswrapper[4713]: I0314 05:43:10.579795 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.136966 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f"] Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.138196 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.146654 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.146922 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.147070 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.178662 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f"] Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.223368 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx"] Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.224148 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.228930 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.236961 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.246345 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx"] Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.253895 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.253948 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.254016 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 
05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.254041 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-config\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.254083 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg42d\" (UniqueName: \"kubernetes.io/projected/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-kube-api-access-fg42d\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.254120 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.349800 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd"] Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.351103 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.355522 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.355736 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-f74pw" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.355857 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.355897 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.355961 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356102 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356720 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356758 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " 
pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356806 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356823 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-config\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356847 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1d1f7414-07c8-48ab-bc8b-3892473aa10f-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356867 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1f7414-07c8-48ab-bc8b-3892473aa10f-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356893 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1d1f7414-07c8-48ab-bc8b-3892473aa10f-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356918 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg42d\" (UniqueName: \"kubernetes.io/projected/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-kube-api-access-fg42d\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356936 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1f7414-07c8-48ab-bc8b-3892473aa10f-config\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356963 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.356992 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srrwn\" (UniqueName: \"kubernetes.io/projected/1d1f7414-07c8-48ab-bc8b-3892473aa10f-kube-api-access-srrwn\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") 
" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.361713 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-config\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.365598 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.365924 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.365951 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.369794 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-logging-loki-ca-bundle\") pod 
\"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.381971 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-54c568c9c8-98zs2"] Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.383672 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.397362 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg42d\" (UniqueName: \"kubernetes.io/projected/ed72a9eb-a4ee-430c-9449-566f2c56c3bf-kube-api-access-fg42d\") pod \"logging-loki-querier-6dcbdf8bb8-rbp8f\" (UID: \"ed72a9eb-a4ee-430c-9449-566f2c56c3bf\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.400254 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd"] Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.421079 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-54c568c9c8-98zs2"] Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458291 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-rbac\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458342 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d77ba467-d131-42b6-9297-e30cbb7d9c57-tenants\") pod 
\"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458367 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458390 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458415 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swqxm\" (UniqueName: \"kubernetes.io/projected/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-kube-api-access-swqxm\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458446 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1d1f7414-07c8-48ab-bc8b-3892473aa10f-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 
14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458462 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpwh\" (UniqueName: \"kubernetes.io/projected/d77ba467-d131-42b6-9297-e30cbb7d9c57-kube-api-access-bvpwh\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458480 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-lokistack-gateway\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458495 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-rbac\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458521 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1f7414-07c8-48ab-bc8b-3892473aa10f-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458538 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: 
\"kubernetes.io/secret/1d1f7414-07c8-48ab-bc8b-3892473aa10f-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458559 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-tls-secret\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458586 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458611 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1f7414-07c8-48ab-bc8b-3892473aa10f-config\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458645 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " 
pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458667 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d77ba467-d131-42b6-9297-e30cbb7d9c57-tls-secret\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458708 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-tenants\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458728 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srrwn\" (UniqueName: \"kubernetes.io/projected/1d1f7414-07c8-48ab-bc8b-3892473aa10f-kube-api-access-srrwn\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458747 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d77ba467-d131-42b6-9297-e30cbb7d9c57-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458773 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.458797 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-lokistack-gateway\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.463064 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1f7414-07c8-48ab-bc8b-3892473aa10f-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.463105 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1f7414-07c8-48ab-bc8b-3892473aa10f-config\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.476437 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.477026 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1d1f7414-07c8-48ab-bc8b-3892473aa10f-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.477953 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1d1f7414-07c8-48ab-bc8b-3892473aa10f-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.494138 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srrwn\" (UniqueName: \"kubernetes.io/projected/1d1f7414-07c8-48ab-bc8b-3892473aa10f-kube-api-access-srrwn\") pod \"logging-loki-query-frontend-ff66c4dc9-5d4kx\" (UID: \"1d1f7414-07c8-48ab-bc8b-3892473aa10f\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.531152 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6"] Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.553840 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560661 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d77ba467-d131-42b6-9297-e30cbb7d9c57-tls-secret\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560707 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-tenants\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560728 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d77ba467-d131-42b6-9297-e30cbb7d9c57-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560760 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560784 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: 
\"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-lokistack-gateway\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560812 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-rbac\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560834 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d77ba467-d131-42b6-9297-e30cbb7d9c57-tenants\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560857 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560884 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560910 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swqxm\" (UniqueName: \"kubernetes.io/projected/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-kube-api-access-swqxm\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560936 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvpwh\" (UniqueName: \"kubernetes.io/projected/d77ba467-d131-42b6-9297-e30cbb7d9c57-kube-api-access-bvpwh\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560956 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-lokistack-gateway\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560970 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-rbac\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.560990 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-tls-secret\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " 
pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.561012 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.561055 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.562198 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.564599 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-lokistack-gateway\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.565158 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.567241 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d77ba467-d131-42b6-9297-e30cbb7d9c57-tls-secret\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.567987 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d77ba467-d131-42b6-9297-e30cbb7d9c57-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.579490 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d77ba467-d131-42b6-9297-e30cbb7d9c57-tenants\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.581733 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.581740 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-tenants\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.586144 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-rbac\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.586319 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvpwh\" (UniqueName: \"kubernetes.io/projected/d77ba467-d131-42b6-9297-e30cbb7d9c57-kube-api-access-bvpwh\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.587054 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.590349 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-lokistack-gateway\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.594915 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-logging-loki-ca-bundle\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.596443 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-tls-secret\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.597034 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d77ba467-d131-42b6-9297-e30cbb7d9c57-rbac\") pod \"logging-loki-gateway-54c568c9c8-98zs2\" (UID: \"d77ba467-d131-42b6-9297-e30cbb7d9c57\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.607421 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swqxm\" (UniqueName: \"kubernetes.io/projected/8eed3eb1-25e3-4d02-b5fd-d8f691af6c21-kube-api-access-swqxm\") pod \"logging-loki-gateway-54c568c9c8-mwlnd\" (UID: \"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21\") " pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.706443 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd"
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.710135 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" event={"ID":"82a7870b-ac91-41f9-a94f-41db191e711b","Type":"ContainerStarted","Data":"965599d0f2577d329bd83cf00a64732dfcbf4be554a6df5eceabca51cef18bd9"}
Mar 14 05:43:11 crc kubenswrapper[4713]: I0314 05:43:11.713265 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.001792 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f"]
Mar 14 05:43:12 crc kubenswrapper[4713]: W0314 05:43:12.003991 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded72a9eb_a4ee_430c_9449_566f2c56c3bf.slice/crio-46a380341b30b9a4205f1ab979dafb5fe10c50248f2df44df8ddf244d612a5e5 WatchSource:0}: Error finding container 46a380341b30b9a4205f1ab979dafb5fe10c50248f2df44df8ddf244d612a5e5: Status 404 returned error can't find the container with id 46a380341b30b9a4205f1ab979dafb5fe10c50248f2df44df8ddf244d612a5e5
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.108831 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx"]
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.141682 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.142584 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.148542 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.148704 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.170743 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.268980 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.270110 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.271638 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.271679 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d86f0c19-27d0-43bf-8b7d-e69b3fe13397\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d86f0c19-27d0-43bf-8b7d-e69b3fe13397\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.271713 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.271742 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-config\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.271787 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-72fdc5da-cef5-4245-a008-b32a01be29af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72fdc5da-cef5-4245-a008-b32a01be29af\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.271807 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfl99\" (UniqueName: \"kubernetes.io/projected/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-kube-api-access-cfl99\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.271827 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.271849 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.275437 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd"]
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.279852 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.280093 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.304306 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.354648 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.355785 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.361009 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.371515 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374072 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-12096e3b-ceeb-4544-bc52-ffd2233bef18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12096e3b-ceeb-4544-bc52-ffd2233bef18\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374167 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374232 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374305 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374340 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d86f0c19-27d0-43bf-8b7d-e69b3fe13397\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d86f0c19-27d0-43bf-8b7d-e69b3fe13397\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374368 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374412 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374439 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-config\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374479 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-config\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374532 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374558 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374584 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-72fdc5da-cef5-4245-a008-b32a01be29af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72fdc5da-cef5-4245-a008-b32a01be29af\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374650 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56k69\" (UniqueName: \"kubernetes.io/projected/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-kube-api-access-56k69\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.374680 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfl99\" (UniqueName: \"kubernetes.io/projected/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-kube-api-access-cfl99\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.376173 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-config\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.376945 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.378246 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-54c568c9c8-98zs2"]
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.386601 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.386687 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d86f0c19-27d0-43bf-8b7d-e69b3fe13397\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d86f0c19-27d0-43bf-8b7d-e69b3fe13397\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/03dde04b72c68bfd63a9f7f825c28a1e1b3a5e3f9e75dc60b1e163be40e9412e/globalmount\"" pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.388780 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.388836 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-72fdc5da-cef5-4245-a008-b32a01be29af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72fdc5da-cef5-4245-a008-b32a01be29af\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0d7c398a4f2baf486d91d185266622117bcf6541a048665fa23a433f81f41967/globalmount\"" pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.391420 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.391596 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.393058 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.397908 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.409219 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfl99\" (UniqueName: \"kubernetes.io/projected/f04ba68c-50bf-406f-977f-7cf9b7d1f4b4-kube-api-access-cfl99\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.435088 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d86f0c19-27d0-43bf-8b7d-e69b3fe13397\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d86f0c19-27d0-43bf-8b7d-e69b3fe13397\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.435539 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-72fdc5da-cef5-4245-a008-b32a01be29af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72fdc5da-cef5-4245-a008-b32a01be29af\") pod \"logging-loki-ingester-0\" (UID: \"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476507 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476596 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476633 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2f3622-77e0-46da-95dc-1a17548790a7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476662 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476684 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476709 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476740 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56k69\" (UniqueName: \"kubernetes.io/projected/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-kube-api-access-56k69\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476772 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-12096e3b-ceeb-4544-bc52-ffd2233bef18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12096e3b-ceeb-4544-bc52-ffd2233bef18\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476811 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfddr\" (UniqueName: \"kubernetes.io/projected/ac2f3622-77e0-46da-95dc-1a17548790a7-kube-api-access-vfddr\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476859 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8a81539e-94e4-4598-90f1-0f179be2eaeb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a81539e-94e4-4598-90f1-0f179be2eaeb\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476887 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-config\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476916 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476950 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.476989 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.479769 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.479998 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.480591 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-config\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.480854 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.481611 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.483399 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.483433 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-12096e3b-ceeb-4544-bc52-ffd2233bef18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12096e3b-ceeb-4544-bc52-ffd2233bef18\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f385f873ce9d174649ad04b6a0ebad9c7d74febedf471062acdf6b83e4424734/globalmount\"" pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.492467 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.497797 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56k69\" (UniqueName: \"kubernetes.io/projected/fd0e6ea3-0887-4eba-a83d-f76a405b0d56-kube-api-access-56k69\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.512749 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-12096e3b-ceeb-4544-bc52-ffd2233bef18\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-12096e3b-ceeb-4544-bc52-ffd2233bef18\") pod \"logging-loki-compactor-0\" (UID: \"fd0e6ea3-0887-4eba-a83d-f76a405b0d56\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.578700 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.578808 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfddr\" (UniqueName: \"kubernetes.io/projected/ac2f3622-77e0-46da-95dc-1a17548790a7-kube-api-access-vfddr\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.578877 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8a81539e-94e4-4598-90f1-0f179be2eaeb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a81539e-94e4-4598-90f1-0f179be2eaeb\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.578904 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.578948 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.579004 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.579055 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2f3622-77e0-46da-95dc-1a17548790a7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.581595 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.582259 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.582290 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8a81539e-94e4-4598-90f1-0f179be2eaeb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a81539e-94e4-4598-90f1-0f179be2eaeb\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/461382e02dab89f056bc8d120ef9b32914de870aa985d80ade68bab56c0cbcf0/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.582419 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2f3622-77e0-46da-95dc-1a17548790a7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.583632 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.587412 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.594532 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/ac2f3622-77e0-46da-95dc-1a17548790a7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.613558 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.614941 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfddr\" (UniqueName: \"kubernetes.io/projected/ac2f3622-77e0-46da-95dc-1a17548790a7-kube-api-access-vfddr\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.623574 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8a81539e-94e4-4598-90f1-0f179be2eaeb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a81539e-94e4-4598-90f1-0f179be2eaeb\") pod \"logging-loki-index-gateway-0\" (UID: \"ac2f3622-77e0-46da-95dc-1a17548790a7\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.688688 4713 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.725017 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" event={"ID":"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21","Type":"ContainerStarted","Data":"da19c914a8d980ab617a024d8698c21930de0325beeef38448aad91a07121185"} Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.726024 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" event={"ID":"1d1f7414-07c8-48ab-bc8b-3892473aa10f","Type":"ContainerStarted","Data":"a5fc51311f08f9c76f531c6caed7dc16150a33dfe24d9a475ec2eb899d552c20"} Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.726902 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" event={"ID":"d77ba467-d131-42b6-9297-e30cbb7d9c57","Type":"ContainerStarted","Data":"3bf72bb30a6efc0070282ea5cd1f43c5d12de2f93c46d33c306ad7c43f5117e3"} Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.728035 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" event={"ID":"ed72a9eb-a4ee-430c-9449-566f2c56c3bf","Type":"ContainerStarted","Data":"46a380341b30b9a4205f1ab979dafb5fe10c50248f2df44df8ddf244d612a5e5"} Mar 14 05:43:12 crc kubenswrapper[4713]: I0314 05:43:12.939317 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 14 05:43:12 crc kubenswrapper[4713]: W0314 05:43:12.943365 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf04ba68c_50bf_406f_977f_7cf9b7d1f4b4.slice/crio-58a85b0b018cc1c21b75432e70e95928ba21afbdbb9a3900f80f8a0e627c0e66 WatchSource:0}: Error finding container 58a85b0b018cc1c21b75432e70e95928ba21afbdbb9a3900f80f8a0e627c0e66: 
Status 404 returned error can't find the container with id 58a85b0b018cc1c21b75432e70e95928ba21afbdbb9a3900f80f8a0e627c0e66 Mar 14 05:43:13 crc kubenswrapper[4713]: I0314 05:43:13.090088 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 14 05:43:13 crc kubenswrapper[4713]: I0314 05:43:13.119629 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 14 05:43:13 crc kubenswrapper[4713]: W0314 05:43:13.122954 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac2f3622_77e0_46da_95dc_1a17548790a7.slice/crio-8992d05b6d6826c8e4102cfee9a43ce0260e397d0197a8bd2c41d9bd4b13908e WatchSource:0}: Error finding container 8992d05b6d6826c8e4102cfee9a43ce0260e397d0197a8bd2c41d9bd4b13908e: Status 404 returned error can't find the container with id 8992d05b6d6826c8e4102cfee9a43ce0260e397d0197a8bd2c41d9bd4b13908e Mar 14 05:43:13 crc kubenswrapper[4713]: I0314 05:43:13.735562 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4","Type":"ContainerStarted","Data":"58a85b0b018cc1c21b75432e70e95928ba21afbdbb9a3900f80f8a0e627c0e66"} Mar 14 05:43:13 crc kubenswrapper[4713]: I0314 05:43:13.736868 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"fd0e6ea3-0887-4eba-a83d-f76a405b0d56","Type":"ContainerStarted","Data":"891d1a991692941560eee893fd422ed5753150f127c46a6e6307a81c0d68da0e"} Mar 14 05:43:13 crc kubenswrapper[4713]: I0314 05:43:13.738026 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"ac2f3622-77e0-46da-95dc-1a17548790a7","Type":"ContainerStarted","Data":"8992d05b6d6826c8e4102cfee9a43ce0260e397d0197a8bd2c41d9bd4b13908e"} Mar 14 05:43:24 crc 
kubenswrapper[4713]: I0314 05:43:24.818437 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" event={"ID":"d77ba467-d131-42b6-9297-e30cbb7d9c57","Type":"ContainerStarted","Data":"8c9c770da135ca3b090346fee03f8836a5690352286d76ec1fe41c0122192b19"} Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.820578 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" event={"ID":"ed72a9eb-a4ee-430c-9449-566f2c56c3bf","Type":"ContainerStarted","Data":"6f5eed7c1d02ec920b3b8ae022d157df67f32b9bdf73462a81e6c050172b23f0"} Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.820674 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.822988 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" event={"ID":"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21","Type":"ContainerStarted","Data":"926fa5db37e186bbb061aa86a44f51c8b39e310623fc2fa15e1bf0f2b6333843"} Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.824696 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" event={"ID":"82a7870b-ac91-41f9-a94f-41db191e711b","Type":"ContainerStarted","Data":"c41c9ef7f8317770f627072804ac081336cb1608ca69c81918731d575aaf3059"} Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.824768 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.826479 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" 
event={"ID":"f04ba68c-50bf-406f-977f-7cf9b7d1f4b4","Type":"ContainerStarted","Data":"3d033f95ceb3f8741fe9a8ed6ab3ccbbfc2055b6b2b6ba0ccdd7f647aae990ce"} Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.826619 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.827647 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"fd0e6ea3-0887-4eba-a83d-f76a405b0d56","Type":"ContainerStarted","Data":"8c2221c5fa349347b58f77ffdbf36ca1fe26d356b8bdd54fb24e5da6e7dc5e9f"} Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.827751 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.829573 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"ac2f3622-77e0-46da-95dc-1a17548790a7","Type":"ContainerStarted","Data":"9d81b947907a8f24deda15af31c7ebfb47d0225c405352d7274e1271c52c1f15"} Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.829698 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.830788 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" event={"ID":"1d1f7414-07c8-48ab-bc8b-3892473aa10f","Type":"ContainerStarted","Data":"7151674d57bda36d15fa600155a0443e3d295ce1cb9b8e4859debe69456de668"} Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.830915 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.851247 4713 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" podStartSLOduration=2.641858738 podStartE2EDuration="13.85122978s" podCreationTimestamp="2026-03-14 05:43:11 +0000 UTC" firstStartedPulling="2026-03-14 05:43:12.006195894 +0000 UTC m=+975.094105194" lastFinishedPulling="2026-03-14 05:43:23.215566936 +0000 UTC m=+986.303476236" observedRunningTime="2026-03-14 05:43:24.850092375 +0000 UTC m=+987.938001685" watchObservedRunningTime="2026-03-14 05:43:24.85122978 +0000 UTC m=+987.939139080" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.889170 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" podStartSLOduration=3.00498801 podStartE2EDuration="14.88914916s" podCreationTimestamp="2026-03-14 05:43:10 +0000 UTC" firstStartedPulling="2026-03-14 05:43:11.551438972 +0000 UTC m=+974.639348272" lastFinishedPulling="2026-03-14 05:43:23.435600112 +0000 UTC m=+986.523509422" observedRunningTime="2026-03-14 05:43:24.884664709 +0000 UTC m=+987.972574009" watchObservedRunningTime="2026-03-14 05:43:24.88914916 +0000 UTC m=+987.977058460" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.942911 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.024356711 podStartE2EDuration="13.942883976s" podCreationTimestamp="2026-03-14 05:43:11 +0000 UTC" firstStartedPulling="2026-03-14 05:43:12.945370579 +0000 UTC m=+976.033279879" lastFinishedPulling="2026-03-14 05:43:23.863897834 +0000 UTC m=+986.951807144" observedRunningTime="2026-03-14 05:43:24.915264189 +0000 UTC m=+988.003173489" watchObservedRunningTime="2026-03-14 05:43:24.942883976 +0000 UTC m=+988.030793286" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.943528 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" 
podStartSLOduration=3.335176157 podStartE2EDuration="13.943522826s" podCreationTimestamp="2026-03-14 05:43:11 +0000 UTC" firstStartedPulling="2026-03-14 05:43:13.125312577 +0000 UTC m=+976.213221877" lastFinishedPulling="2026-03-14 05:43:23.733659246 +0000 UTC m=+986.821568546" observedRunningTime="2026-03-14 05:43:24.940524312 +0000 UTC m=+988.028433612" watchObservedRunningTime="2026-03-14 05:43:24.943522826 +0000 UTC m=+988.031432136" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.965270 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" podStartSLOduration=2.2350514009999998 podStartE2EDuration="13.965246959s" podCreationTimestamp="2026-03-14 05:43:11 +0000 UTC" firstStartedPulling="2026-03-14 05:43:12.135772831 +0000 UTC m=+975.223682131" lastFinishedPulling="2026-03-14 05:43:23.865968389 +0000 UTC m=+986.953877689" observedRunningTime="2026-03-14 05:43:24.962689498 +0000 UTC m=+988.050598798" watchObservedRunningTime="2026-03-14 05:43:24.965246959 +0000 UTC m=+988.053156269" Mar 14 05:43:24 crc kubenswrapper[4713]: I0314 05:43:24.997248 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.260266325 podStartE2EDuration="13.997230102s" podCreationTimestamp="2026-03-14 05:43:11 +0000 UTC" firstStartedPulling="2026-03-14 05:43:13.10823042 +0000 UTC m=+976.196139730" lastFinishedPulling="2026-03-14 05:43:23.845194207 +0000 UTC m=+986.933103507" observedRunningTime="2026-03-14 05:43:24.996056405 +0000 UTC m=+988.083965705" watchObservedRunningTime="2026-03-14 05:43:24.997230102 +0000 UTC m=+988.085139402" Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.851385 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" 
event={"ID":"8eed3eb1-25e3-4d02-b5fd-d8f691af6c21","Type":"ContainerStarted","Data":"af994bdafd3dae741a588c0fda83e27fed3de39883481e52a431abc6387bb3d5"} Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.851792 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.854118 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" event={"ID":"d77ba467-d131-42b6-9297-e30cbb7d9c57","Type":"ContainerStarted","Data":"1435e467abc8aa9d2cc665cade9ae646c04cacfa7f13b3931f8789a1358ad2a1"} Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.854639 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.854770 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.861891 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.862773 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.865666 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.876633 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podStartSLOduration=1.674308571 podStartE2EDuration="15.876612814s" podCreationTimestamp="2026-03-14 05:43:11 +0000 UTC" 
firstStartedPulling="2026-03-14 05:43:12.293575313 +0000 UTC m=+975.381484613" lastFinishedPulling="2026-03-14 05:43:26.495879556 +0000 UTC m=+989.583788856" observedRunningTime="2026-03-14 05:43:26.874444017 +0000 UTC m=+989.962353317" watchObservedRunningTime="2026-03-14 05:43:26.876612814 +0000 UTC m=+989.964522114" Mar 14 05:43:26 crc kubenswrapper[4713]: I0314 05:43:26.903045 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podStartSLOduration=1.799421318 podStartE2EDuration="15.903023003s" podCreationTimestamp="2026-03-14 05:43:11 +0000 UTC" firstStartedPulling="2026-03-14 05:43:12.38779981 +0000 UTC m=+975.475709110" lastFinishedPulling="2026-03-14 05:43:26.491401495 +0000 UTC m=+989.579310795" observedRunningTime="2026-03-14 05:43:26.899934606 +0000 UTC m=+989.987843916" watchObservedRunningTime="2026-03-14 05:43:26.903023003 +0000 UTC m=+989.990932303" Mar 14 05:43:27 crc kubenswrapper[4713]: I0314 05:43:27.862229 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:27 crc kubenswrapper[4713]: I0314 05:43:27.873239 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.586904 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4brsp"] Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.593287 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.607750 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4brsp"] Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.714509 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-utilities\") pod \"certified-operators-4brsp\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") " pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.714580 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-catalog-content\") pod \"certified-operators-4brsp\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") " pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.714687 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bgx\" (UniqueName: \"kubernetes.io/projected/bbce4dce-36da-4039-92af-99123f46b0e9-kube-api-access-l5bgx\") pod \"certified-operators-4brsp\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") " pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.816416 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-utilities\") pod \"certified-operators-4brsp\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") " pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.816494 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-catalog-content\") pod \"certified-operators-4brsp\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") " pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.816555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5bgx\" (UniqueName: \"kubernetes.io/projected/bbce4dce-36da-4039-92af-99123f46b0e9-kube-api-access-l5bgx\") pod \"certified-operators-4brsp\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") " pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.817107 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-catalog-content\") pod \"certified-operators-4brsp\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") " pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.817058 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-utilities\") pod \"certified-operators-4brsp\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") " pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.844094 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bgx\" (UniqueName: \"kubernetes.io/projected/bbce4dce-36da-4039-92af-99123f46b0e9-kube-api-access-l5bgx\") pod \"certified-operators-4brsp\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") " pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:29 crc kubenswrapper[4713]: I0314 05:43:29.921603 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:30 crc kubenswrapper[4713]: I0314 05:43:30.381418 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4brsp"] Mar 14 05:43:30 crc kubenswrapper[4713]: I0314 05:43:30.881681 4713 generic.go:334] "Generic (PLEG): container finished" podID="bbce4dce-36da-4039-92af-99123f46b0e9" containerID="ec42604e290bb2af04558ca7a2b2968da5271c0323690fad87fdc09ef7f2040f" exitCode=0 Mar 14 05:43:30 crc kubenswrapper[4713]: I0314 05:43:30.881739 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4brsp" event={"ID":"bbce4dce-36da-4039-92af-99123f46b0e9","Type":"ContainerDied","Data":"ec42604e290bb2af04558ca7a2b2968da5271c0323690fad87fdc09ef7f2040f"} Mar 14 05:43:30 crc kubenswrapper[4713]: I0314 05:43:30.881771 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4brsp" event={"ID":"bbce4dce-36da-4039-92af-99123f46b0e9","Type":"ContainerStarted","Data":"1f230ba5068ce7dd2fbc98b8645b86c8f66787b898d13286b281ba1086be2ca0"} Mar 14 05:43:32 crc kubenswrapper[4713]: I0314 05:43:32.896562 4713 generic.go:334] "Generic (PLEG): container finished" podID="bbce4dce-36da-4039-92af-99123f46b0e9" containerID="2897cf70167cd6c84b6f1b0d700238caf8ca332385da2adfaa8416f58db5818c" exitCode=0 Mar 14 05:43:32 crc kubenswrapper[4713]: I0314 05:43:32.897094 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4brsp" event={"ID":"bbce4dce-36da-4039-92af-99123f46b0e9","Type":"ContainerDied","Data":"2897cf70167cd6c84b6f1b0d700238caf8ca332385da2adfaa8416f58db5818c"} Mar 14 05:43:35 crc kubenswrapper[4713]: I0314 05:43:35.921591 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4brsp" 
event={"ID":"bbce4dce-36da-4039-92af-99123f46b0e9","Type":"ContainerStarted","Data":"d3741d901dd286920e651290b6b8f2b48ddcb04258cbfe35af2850f6c63bba36"} Mar 14 05:43:35 crc kubenswrapper[4713]: I0314 05:43:35.948304 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4brsp" podStartSLOduration=3.778714445 podStartE2EDuration="6.948269507s" podCreationTimestamp="2026-03-14 05:43:29 +0000 UTC" firstStartedPulling="2026-03-14 05:43:30.883720333 +0000 UTC m=+993.971629633" lastFinishedPulling="2026-03-14 05:43:34.053275395 +0000 UTC m=+997.141184695" observedRunningTime="2026-03-14 05:43:35.941582208 +0000 UTC m=+999.029491508" watchObservedRunningTime="2026-03-14 05:43:35.948269507 +0000 UTC m=+999.036178807" Mar 14 05:43:39 crc kubenswrapper[4713]: I0314 05:43:39.921916 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:39 crc kubenswrapper[4713]: I0314 05:43:39.922967 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:40 crc kubenswrapper[4713]: I0314 05:43:40.024151 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:40 crc kubenswrapper[4713]: I0314 05:43:40.099220 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4brsp" Mar 14 05:43:40 crc kubenswrapper[4713]: I0314 05:43:40.284304 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4brsp"] Mar 14 05:43:40 crc kubenswrapper[4713]: I0314 05:43:40.589143 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" Mar 14 05:43:41 crc kubenswrapper[4713]: I0314 05:43:41.489071 4713 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" Mar 14 05:43:41 crc kubenswrapper[4713]: I0314 05:43:41.578119 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" Mar 14 05:43:41 crc kubenswrapper[4713]: I0314 05:43:41.979815 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4brsp" podUID="bbce4dce-36da-4039-92af-99123f46b0e9" containerName="registry-server" containerID="cri-o://d3741d901dd286920e651290b6b8f2b48ddcb04258cbfe35af2850f6c63bba36" gracePeriod=2 Mar 14 05:43:42 crc kubenswrapper[4713]: I0314 05:43:42.545358 4713 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 14 05:43:42 crc kubenswrapper[4713]: I0314 05:43:42.545685 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f04ba68c-50bf-406f-977f-7cf9b7d1f4b4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 14 05:43:42 crc kubenswrapper[4713]: I0314 05:43:42.695807 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 14 05:43:42 crc kubenswrapper[4713]: I0314 05:43:42.707376 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.006718 4713 generic.go:334] "Generic (PLEG): container finished" podID="bbce4dce-36da-4039-92af-99123f46b0e9" containerID="d3741d901dd286920e651290b6b8f2b48ddcb04258cbfe35af2850f6c63bba36" exitCode=0 Mar 14 05:43:43 crc 
kubenswrapper[4713]: I0314 05:43:43.006784 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4brsp" event={"ID":"bbce4dce-36da-4039-92af-99123f46b0e9","Type":"ContainerDied","Data":"d3741d901dd286920e651290b6b8f2b48ddcb04258cbfe35af2850f6c63bba36"}
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.096337 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4brsp"
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.250557 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-catalog-content\") pod \"bbce4dce-36da-4039-92af-99123f46b0e9\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") "
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.250633 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-utilities\") pod \"bbce4dce-36da-4039-92af-99123f46b0e9\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") "
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.250916 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5bgx\" (UniqueName: \"kubernetes.io/projected/bbce4dce-36da-4039-92af-99123f46b0e9-kube-api-access-l5bgx\") pod \"bbce4dce-36da-4039-92af-99123f46b0e9\" (UID: \"bbce4dce-36da-4039-92af-99123f46b0e9\") "
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.251873 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-utilities" (OuterVolumeSpecName: "utilities") pod "bbce4dce-36da-4039-92af-99123f46b0e9" (UID: "bbce4dce-36da-4039-92af-99123f46b0e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.261350 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbce4dce-36da-4039-92af-99123f46b0e9-kube-api-access-l5bgx" (OuterVolumeSpecName: "kube-api-access-l5bgx") pod "bbce4dce-36da-4039-92af-99123f46b0e9" (UID: "bbce4dce-36da-4039-92af-99123f46b0e9"). InnerVolumeSpecName "kube-api-access-l5bgx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.308619 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbce4dce-36da-4039-92af-99123f46b0e9" (UID: "bbce4dce-36da-4039-92af-99123f46b0e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.352757 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5bgx\" (UniqueName: \"kubernetes.io/projected/bbce4dce-36da-4039-92af-99123f46b0e9-kube-api-access-l5bgx\") on node \"crc\" DevicePath \"\""
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.352803 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:43:43 crc kubenswrapper[4713]: I0314 05:43:43.352818 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbce4dce-36da-4039-92af-99123f46b0e9-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:43:44 crc kubenswrapper[4713]: I0314 05:43:44.017752 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4brsp" event={"ID":"bbce4dce-36da-4039-92af-99123f46b0e9","Type":"ContainerDied","Data":"1f230ba5068ce7dd2fbc98b8645b86c8f66787b898d13286b281ba1086be2ca0"}
Mar 14 05:43:44 crc kubenswrapper[4713]: I0314 05:43:44.017841 4713 scope.go:117] "RemoveContainer" containerID="d3741d901dd286920e651290b6b8f2b48ddcb04258cbfe35af2850f6c63bba36"
Mar 14 05:43:44 crc kubenswrapper[4713]: I0314 05:43:44.017869 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4brsp"
Mar 14 05:43:44 crc kubenswrapper[4713]: I0314 05:43:44.045822 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4brsp"]
Mar 14 05:43:44 crc kubenswrapper[4713]: I0314 05:43:44.050135 4713 scope.go:117] "RemoveContainer" containerID="2897cf70167cd6c84b6f1b0d700238caf8ca332385da2adfaa8416f58db5818c"
Mar 14 05:43:44 crc kubenswrapper[4713]: I0314 05:43:44.053809 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4brsp"]
Mar 14 05:43:44 crc kubenswrapper[4713]: I0314 05:43:44.070978 4713 scope.go:117] "RemoveContainer" containerID="ec42604e290bb2af04558ca7a2b2968da5271c0323690fad87fdc09ef7f2040f"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.575158 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbce4dce-36da-4039-92af-99123f46b0e9" path="/var/lib/kubelet/pods/bbce4dce-36da-4039-92af-99123f46b0e9/volumes"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.690719 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qdvdh"]
Mar 14 05:43:45 crc kubenswrapper[4713]: E0314 05:43:45.691067 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbce4dce-36da-4039-92af-99123f46b0e9" containerName="extract-utilities"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.691091 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbce4dce-36da-4039-92af-99123f46b0e9" containerName="extract-utilities"
Mar 14 05:43:45 crc kubenswrapper[4713]: E0314 05:43:45.691108 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbce4dce-36da-4039-92af-99123f46b0e9" containerName="registry-server"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.691117 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbce4dce-36da-4039-92af-99123f46b0e9" containerName="registry-server"
Mar 14 05:43:45 crc kubenswrapper[4713]: E0314 05:43:45.691140 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbce4dce-36da-4039-92af-99123f46b0e9" containerName="extract-content"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.691145 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbce4dce-36da-4039-92af-99123f46b0e9" containerName="extract-content"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.691349 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbce4dce-36da-4039-92af-99123f46b0e9" containerName="registry-server"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.692535 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.715588 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdvdh"]
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.794049 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-utilities\") pod \"community-operators-qdvdh\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") " pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.794154 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpbsg\" (UniqueName: \"kubernetes.io/projected/8d499717-5834-4ebd-8992-b971da501c36-kube-api-access-cpbsg\") pod \"community-operators-qdvdh\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") " pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.794224 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-catalog-content\") pod \"community-operators-qdvdh\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") " pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.895633 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-catalog-content\") pod \"community-operators-qdvdh\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") " pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.895733 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-utilities\") pod \"community-operators-qdvdh\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") " pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.895829 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpbsg\" (UniqueName: \"kubernetes.io/projected/8d499717-5834-4ebd-8992-b971da501c36-kube-api-access-cpbsg\") pod \"community-operators-qdvdh\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") " pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.896290 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-catalog-content\") pod \"community-operators-qdvdh\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") " pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.896478 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-utilities\") pod \"community-operators-qdvdh\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") " pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:45 crc kubenswrapper[4713]: I0314 05:43:45.917387 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpbsg\" (UniqueName: \"kubernetes.io/projected/8d499717-5834-4ebd-8992-b971da501c36-kube-api-access-cpbsg\") pod \"community-operators-qdvdh\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") " pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:46 crc kubenswrapper[4713]: I0314 05:43:46.026059 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:43:46 crc kubenswrapper[4713]: I0314 05:43:46.388656 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdvdh"]
Mar 14 05:43:47 crc kubenswrapper[4713]: I0314 05:43:47.059008 4713 generic.go:334] "Generic (PLEG): container finished" podID="8d499717-5834-4ebd-8992-b971da501c36" containerID="c94a9b5dbe45e62c77dcd0ecfddce2f101941fbf0c3bac1eaa66107d30f0ab38" exitCode=0
Mar 14 05:43:47 crc kubenswrapper[4713]: I0314 05:43:47.059053 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdvdh" event={"ID":"8d499717-5834-4ebd-8992-b971da501c36","Type":"ContainerDied","Data":"c94a9b5dbe45e62c77dcd0ecfddce2f101941fbf0c3bac1eaa66107d30f0ab38"}
Mar 14 05:43:47 crc kubenswrapper[4713]: I0314 05:43:47.059081 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdvdh" event={"ID":"8d499717-5834-4ebd-8992-b971da501c36","Type":"ContainerStarted","Data":"ba0b88225d4e28f57351cb7587b57d416feb00aa4a8d26ac12ce33996fe916db"}
Mar 14 05:43:52 crc kubenswrapper[4713]: I0314 05:43:52.498799 4713 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens
Mar 14 05:43:52 crc kubenswrapper[4713]: I0314 05:43:52.499340 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f04ba68c-50bf-406f-977f-7cf9b7d1f4b4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 14 05:43:54 crc kubenswrapper[4713]: I0314 05:43:54.112175 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdvdh" event={"ID":"8d499717-5834-4ebd-8992-b971da501c36","Type":"ContainerStarted","Data":"84ece8250c788cfd400bafbd804a4565f11e13e0b5821f92565b190bdefb0a43"}
Mar 14 05:43:55 crc kubenswrapper[4713]: I0314 05:43:55.120869 4713 generic.go:334] "Generic (PLEG): container finished" podID="8d499717-5834-4ebd-8992-b971da501c36" containerID="84ece8250c788cfd400bafbd804a4565f11e13e0b5821f92565b190bdefb0a43" exitCode=0
Mar 14 05:43:55 crc kubenswrapper[4713]: I0314 05:43:55.120913 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdvdh" event={"ID":"8d499717-5834-4ebd-8992-b971da501c36","Type":"ContainerDied","Data":"84ece8250c788cfd400bafbd804a4565f11e13e0b5821f92565b190bdefb0a43"}
Mar 14 05:43:57 crc kubenswrapper[4713]: I0314 05:43:57.138789 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdvdh" event={"ID":"8d499717-5834-4ebd-8992-b971da501c36","Type":"ContainerStarted","Data":"9de06b4f9082310f06c89de1c3b0f9130db9a7707f7c63ac6518b8d533690efb"}
Mar 14 05:43:57 crc kubenswrapper[4713]: I0314 05:43:57.164395 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qdvdh" podStartSLOduration=2.8177802290000002 podStartE2EDuration="12.164361959s" podCreationTimestamp="2026-03-14 05:43:45 +0000 UTC" firstStartedPulling="2026-03-14 05:43:47.061642218 +0000 UTC m=+1010.149551518" lastFinishedPulling="2026-03-14 05:43:56.408223958 +0000 UTC m=+1019.496133248" observedRunningTime="2026-03-14 05:43:57.160749415 +0000 UTC m=+1020.248658745" watchObservedRunningTime="2026-03-14 05:43:57.164361959 +0000 UTC m=+1020.252271269"
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.138685 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557784-hq5bm"]
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.141540 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557784-hq5bm"
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.144367 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.145179 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.145582 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.147484 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557784-hq5bm"]
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.237014 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55tz\" (UniqueName: \"kubernetes.io/projected/3f430a1b-bca2-4fe3-8f22-d83f1ed50e16-kube-api-access-j55tz\") pod \"auto-csr-approver-29557784-hq5bm\" (UID: \"3f430a1b-bca2-4fe3-8f22-d83f1ed50e16\") " pod="openshift-infra/auto-csr-approver-29557784-hq5bm"
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.337981 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j55tz\" (UniqueName: \"kubernetes.io/projected/3f430a1b-bca2-4fe3-8f22-d83f1ed50e16-kube-api-access-j55tz\") pod \"auto-csr-approver-29557784-hq5bm\" (UID: \"3f430a1b-bca2-4fe3-8f22-d83f1ed50e16\") " pod="openshift-infra/auto-csr-approver-29557784-hq5bm"
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.361822 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55tz\" (UniqueName: \"kubernetes.io/projected/3f430a1b-bca2-4fe3-8f22-d83f1ed50e16-kube-api-access-j55tz\") pod \"auto-csr-approver-29557784-hq5bm\" (UID: \"3f430a1b-bca2-4fe3-8f22-d83f1ed50e16\") " pod="openshift-infra/auto-csr-approver-29557784-hq5bm"
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.462453 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557784-hq5bm"
Mar 14 05:44:00 crc kubenswrapper[4713]: I0314 05:44:00.914998 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557784-hq5bm"]
Mar 14 05:44:01 crc kubenswrapper[4713]: I0314 05:44:01.166694 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557784-hq5bm" event={"ID":"3f430a1b-bca2-4fe3-8f22-d83f1ed50e16","Type":"ContainerStarted","Data":"be0b6bbb663755cab78b79736b87776a0e2870338175ac87304d8fa212e4c447"}
Mar 14 05:44:02 crc kubenswrapper[4713]: I0314 05:44:02.498855 4713 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Mar 14 05:44:02 crc kubenswrapper[4713]: I0314 05:44:02.499235 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f04ba68c-50bf-406f-977f-7cf9b7d1f4b4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 14 05:44:04 crc kubenswrapper[4713]: I0314 05:44:04.196505 4713 generic.go:334] "Generic (PLEG): container finished" podID="3f430a1b-bca2-4fe3-8f22-d83f1ed50e16" containerID="241b1ad131aaa47ac2c35ca42a749cc53027b02890f5488c3934371c9fa7dbbd" exitCode=0
Mar 14 05:44:04 crc kubenswrapper[4713]: I0314 05:44:04.196563 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557784-hq5bm" event={"ID":"3f430a1b-bca2-4fe3-8f22-d83f1ed50e16","Type":"ContainerDied","Data":"241b1ad131aaa47ac2c35ca42a749cc53027b02890f5488c3934371c9fa7dbbd"}
Mar 14 05:44:05 crc kubenswrapper[4713]: I0314 05:44:05.489770 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557784-hq5bm"
Mar 14 05:44:05 crc kubenswrapper[4713]: I0314 05:44:05.532313 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j55tz\" (UniqueName: \"kubernetes.io/projected/3f430a1b-bca2-4fe3-8f22-d83f1ed50e16-kube-api-access-j55tz\") pod \"3f430a1b-bca2-4fe3-8f22-d83f1ed50e16\" (UID: \"3f430a1b-bca2-4fe3-8f22-d83f1ed50e16\") "
Mar 14 05:44:05 crc kubenswrapper[4713]: I0314 05:44:05.538814 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f430a1b-bca2-4fe3-8f22-d83f1ed50e16-kube-api-access-j55tz" (OuterVolumeSpecName: "kube-api-access-j55tz") pod "3f430a1b-bca2-4fe3-8f22-d83f1ed50e16" (UID: "3f430a1b-bca2-4fe3-8f22-d83f1ed50e16"). InnerVolumeSpecName "kube-api-access-j55tz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:44:05 crc kubenswrapper[4713]: I0314 05:44:05.636143 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j55tz\" (UniqueName: \"kubernetes.io/projected/3f430a1b-bca2-4fe3-8f22-d83f1ed50e16-kube-api-access-j55tz\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.026260 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.026650 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.112006 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.211324 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557784-hq5bm"
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.211309 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557784-hq5bm" event={"ID":"3f430a1b-bca2-4fe3-8f22-d83f1ed50e16","Type":"ContainerDied","Data":"be0b6bbb663755cab78b79736b87776a0e2870338175ac87304d8fa212e4c447"}
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.211382 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be0b6bbb663755cab78b79736b87776a0e2870338175ac87304d8fa212e4c447"
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.266623 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.349656 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdvdh"]
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.550694 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557778-9hq7z"]
Mar 14 05:44:06 crc kubenswrapper[4713]: I0314 05:44:06.557332 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557778-9hq7z"]
Mar 14 05:44:07 crc kubenswrapper[4713]: I0314 05:44:07.572921 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e734759-d71c-4eec-9f82-fe7c4be7c9a6" path="/var/lib/kubelet/pods/2e734759-d71c-4eec-9f82-fe7c4be7c9a6/volumes"
Mar 14 05:44:08 crc kubenswrapper[4713]: I0314 05:44:08.224085 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qdvdh" podUID="8d499717-5834-4ebd-8992-b971da501c36" containerName="registry-server" containerID="cri-o://9de06b4f9082310f06c89de1c3b0f9130db9a7707f7c63ac6518b8d533690efb" gracePeriod=2
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.244928 4713 generic.go:334] "Generic (PLEG): container finished" podID="8d499717-5834-4ebd-8992-b971da501c36" containerID="9de06b4f9082310f06c89de1c3b0f9130db9a7707f7c63ac6518b8d533690efb" exitCode=0
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.245037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdvdh" event={"ID":"8d499717-5834-4ebd-8992-b971da501c36","Type":"ContainerDied","Data":"9de06b4f9082310f06c89de1c3b0f9130db9a7707f7c63ac6518b8d533690efb"}
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.560053 4713 scope.go:117] "RemoveContainer" containerID="73fd7985b8e9adbc9a5e6db119574fb4934888709ae2a92746dbe020455caffb"
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.567150 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.637810 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpbsg\" (UniqueName: \"kubernetes.io/projected/8d499717-5834-4ebd-8992-b971da501c36-kube-api-access-cpbsg\") pod \"8d499717-5834-4ebd-8992-b971da501c36\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") "
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.637899 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-catalog-content\") pod \"8d499717-5834-4ebd-8992-b971da501c36\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") "
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.637989 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-utilities\") pod \"8d499717-5834-4ebd-8992-b971da501c36\" (UID: \"8d499717-5834-4ebd-8992-b971da501c36\") "
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.639419 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-utilities" (OuterVolumeSpecName: "utilities") pod "8d499717-5834-4ebd-8992-b971da501c36" (UID: "8d499717-5834-4ebd-8992-b971da501c36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.645456 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d499717-5834-4ebd-8992-b971da501c36-kube-api-access-cpbsg" (OuterVolumeSpecName: "kube-api-access-cpbsg") pod "8d499717-5834-4ebd-8992-b971da501c36" (UID: "8d499717-5834-4ebd-8992-b971da501c36"). InnerVolumeSpecName "kube-api-access-cpbsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.695855 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d499717-5834-4ebd-8992-b971da501c36" (UID: "8d499717-5834-4ebd-8992-b971da501c36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.731751 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.731833 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.739667 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpbsg\" (UniqueName: \"kubernetes.io/projected/8d499717-5834-4ebd-8992-b971da501c36-kube-api-access-cpbsg\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.739709 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:10 crc kubenswrapper[4713]: I0314 05:44:10.739720 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d499717-5834-4ebd-8992-b971da501c36-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:11 crc kubenswrapper[4713]: I0314 05:44:11.257548 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdvdh" event={"ID":"8d499717-5834-4ebd-8992-b971da501c36","Type":"ContainerDied","Data":"ba0b88225d4e28f57351cb7587b57d416feb00aa4a8d26ac12ce33996fe916db"}
Mar 14 05:44:11 crc kubenswrapper[4713]: I0314 05:44:11.257603 4713 scope.go:117] "RemoveContainer" containerID="9de06b4f9082310f06c89de1c3b0f9130db9a7707f7c63ac6518b8d533690efb"
Mar 14 05:44:11 crc kubenswrapper[4713]: I0314 05:44:11.257766 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdvdh"
Mar 14 05:44:11 crc kubenswrapper[4713]: I0314 05:44:11.298288 4713 scope.go:117] "RemoveContainer" containerID="84ece8250c788cfd400bafbd804a4565f11e13e0b5821f92565b190bdefb0a43"
Mar 14 05:44:11 crc kubenswrapper[4713]: I0314 05:44:11.342627 4713 scope.go:117] "RemoveContainer" containerID="c94a9b5dbe45e62c77dcd0ecfddce2f101941fbf0c3bac1eaa66107d30f0ab38"
Mar 14 05:44:11 crc kubenswrapper[4713]: I0314 05:44:11.387790 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdvdh"]
Mar 14 05:44:11 crc kubenswrapper[4713]: I0314 05:44:11.419945 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qdvdh"]
Mar 14 05:44:11 crc kubenswrapper[4713]: I0314 05:44:11.571808 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d499717-5834-4ebd-8992-b971da501c36" path="/var/lib/kubelet/pods/8d499717-5834-4ebd-8992-b971da501c36/volumes"
Mar 14 05:44:12 crc kubenswrapper[4713]: I0314 05:44:12.498756 4713 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Mar 14 05:44:12 crc kubenswrapper[4713]: I0314 05:44:12.498814 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f04ba68c-50bf-406f-977f-7cf9b7d1f4b4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 14 05:44:22 crc kubenswrapper[4713]: I0314 05:44:22.498304 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.075807 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-dcnzv"]
Mar 14 05:44:31 crc kubenswrapper[4713]: E0314 05:44:31.076773 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f430a1b-bca2-4fe3-8f22-d83f1ed50e16" containerName="oc"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.076792 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f430a1b-bca2-4fe3-8f22-d83f1ed50e16" containerName="oc"
Mar 14 05:44:31 crc kubenswrapper[4713]: E0314 05:44:31.076817 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d499717-5834-4ebd-8992-b971da501c36" containerName="extract-content"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.076825 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d499717-5834-4ebd-8992-b971da501c36" containerName="extract-content"
Mar 14 05:44:31 crc kubenswrapper[4713]: E0314 05:44:31.076840 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d499717-5834-4ebd-8992-b971da501c36" containerName="registry-server"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.076849 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d499717-5834-4ebd-8992-b971da501c36" containerName="registry-server"
Mar 14 05:44:31 crc kubenswrapper[4713]: E0314 05:44:31.076861 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d499717-5834-4ebd-8992-b971da501c36" containerName="extract-utilities"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.076869 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d499717-5834-4ebd-8992-b971da501c36" containerName="extract-utilities"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.077018 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f430a1b-bca2-4fe3-8f22-d83f1ed50e16" containerName="oc"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.077036 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d499717-5834-4ebd-8992-b971da501c36" containerName="registry-server"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.078268 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.080960 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.081080 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.081275 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-hxn74"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.085839 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.086087 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.092579 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-dcnzv"]
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.100373 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.224134 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-entrypoint\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.224335 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/920b9743-48f9-4b1f-a747-8e2bd580c684-datadir\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.224377 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.224417 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-trusted-ca\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.224440 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-png2t\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-kube-api-access-png2t\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.224830 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config-openshift-service-cacrt\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.224991 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-token\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.225362 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/920b9743-48f9-4b1f-a747-8e2bd580c684-tmp\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.225407 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-metrics\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.225472 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-sa-token\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.225501 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-syslog-receiver\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.241684 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-dcnzv"]
Mar 14 05:44:31 crc kubenswrapper[4713]: E0314 05:44:31.242457 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-png2t metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-dcnzv" podUID="920b9743-48f9-4b1f-a747-8e2bd580c684"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.328056 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/920b9743-48f9-4b1f-a747-8e2bd580c684-datadir\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.328575 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.328238 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/920b9743-48f9-4b1f-a747-8e2bd580c684-datadir\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.328782 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-trusted-ca\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.328916 4713
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-png2t\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-kube-api-access-png2t\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv" Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.329077 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config-openshift-service-cacrt\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv" Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.329149 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-token\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv" Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.329431 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/920b9743-48f9-4b1f-a747-8e2bd580c684-tmp\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv" Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.329736 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv" Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.330159 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: 
\"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config-openshift-service-cacrt\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv" Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.330353 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-metrics\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv" Mar 14 05:44:31 crc kubenswrapper[4713]: E0314 05:44:31.330482 4713 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Mar 14 05:44:31 crc kubenswrapper[4713]: E0314 05:44:31.330546 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-metrics podName:920b9743-48f9-4b1f-a747-8e2bd580c684 nodeName:}" failed. No retries permitted until 2026-03-14 05:44:31.830525296 +0000 UTC m=+1054.918434596 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-metrics") pod "collector-dcnzv" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684") : secret "collector-metrics" not found
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.330490 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-sa-token\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.330608 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-syslog-receiver\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.330706 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-entrypoint\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: E0314 05:44:31.330781 4713 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found
Mar 14 05:44:31 crc kubenswrapper[4713]: E0314 05:44:31.330823 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-syslog-receiver podName:920b9743-48f9-4b1f-a747-8e2bd580c684 nodeName:}" failed. No retries permitted until 2026-03-14 05:44:31.830808756 +0000 UTC m=+1054.918718056 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-syslog-receiver") pod "collector-dcnzv" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684") : secret "collector-syslog-receiver" not found
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.331419 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-entrypoint\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.331547 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-trusted-ca\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.336605 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/920b9743-48f9-4b1f-a747-8e2bd580c684-tmp\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.343635 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-token\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.346500 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-png2t\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-kube-api-access-png2t\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.349079 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-sa-token\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.400954 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.413416 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.533162 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.533671 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-token\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.533802 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/920b9743-48f9-4b1f-a747-8e2bd580c684-datadir\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.533942 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-trusted-ca\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.534050 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-entrypoint\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.534234 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config-openshift-service-cacrt\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.534406 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-png2t\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-kube-api-access-png2t\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.534572 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-sa-token\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.534718 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/920b9743-48f9-4b1f-a747-8e2bd580c684-tmp\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.533813 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config" (OuterVolumeSpecName: "config") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.534678 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/920b9743-48f9-4b1f-a747-8e2bd580c684-datadir" (OuterVolumeSpecName: "datadir") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.535077 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.535536 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.535748 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.535853 4713 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/920b9743-48f9-4b1f-a747-8e2bd580c684-datadir\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.535947 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.536325 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.539465 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-token" (OuterVolumeSpecName: "collector-token") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.544478 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-kube-api-access-png2t" (OuterVolumeSpecName: "kube-api-access-png2t") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "kube-api-access-png2t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.545419 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-sa-token" (OuterVolumeSpecName: "sa-token") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.546416 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/920b9743-48f9-4b1f-a747-8e2bd580c684-tmp" (OuterVolumeSpecName: "tmp") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.637729 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-png2t\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-kube-api-access-png2t\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.637780 4713 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/920b9743-48f9-4b1f-a747-8e2bd580c684-sa-token\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.637795 4713 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/920b9743-48f9-4b1f-a747-8e2bd580c684-tmp\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.637805 4713 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-token\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.637817 4713 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-entrypoint\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.637829 4713 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/920b9743-48f9-4b1f-a747-8e2bd580c684-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.841559 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-metrics\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.841901 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-syslog-receiver\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.845880 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-syslog-receiver\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.845916 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-metrics\") pod \"collector-dcnzv\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") " pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.942479 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-syslog-receiver\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.942582 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-metrics\") pod \"920b9743-48f9-4b1f-a747-8e2bd580c684\" (UID: \"920b9743-48f9-4b1f-a747-8e2bd580c684\") "
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.945284 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-metrics" (OuterVolumeSpecName: "metrics") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:44:31 crc kubenswrapper[4713]: I0314 05:44:31.945311 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "920b9743-48f9-4b1f-a747-8e2bd580c684" (UID: "920b9743-48f9-4b1f-a747-8e2bd580c684"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.044515 4713 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-metrics\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.044544 4713 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/920b9743-48f9-4b1f-a747-8e2bd580c684-collector-syslog-receiver\") on node \"crc\" DevicePath \"\""
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.407536 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-dcnzv"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.466960 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-dcnzv"]
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.475021 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-dcnzv"]
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.479953 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-7q2cm"]
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.495487 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.497383 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-7q2cm"]
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.503166 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.503549 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.503564 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-hxn74"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.503587 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.504179 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.510254 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654133 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-entrypoint\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654255 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-metrics\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-tmp\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654342 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-trusted-ca\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654393 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcsnx\" (UniqueName: \"kubernetes.io/projected/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-kube-api-access-lcsnx\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654463 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-datadir\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654525 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-collector-token\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654546 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-collector-syslog-receiver\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654575 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-sa-token\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654598 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-config\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.654646 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-config-openshift-service-cacrt\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.756321 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-trusted-ca\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.756423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcsnx\" (UniqueName: \"kubernetes.io/projected/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-kube-api-access-lcsnx\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.756481 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-datadir\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.756520 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-collector-token\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.756552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-collector-syslog-receiver\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.756581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-sa-token\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.756610 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-config\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.756683 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-datadir\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.756989 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-config-openshift-service-cacrt\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.757034 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-entrypoint\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.757076 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-tmp\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.757095 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-metrics\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.758013 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-config\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.758287 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-entrypoint\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.758292 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-trusted-ca\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm"
Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.758656 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-config-openshift-service-cacrt\") pod \"collector-7q2cm\"
(UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm" Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.761054 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-collector-token\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm" Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.771456 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-tmp\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm" Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.771728 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-metrics\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm" Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.771872 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-collector-syslog-receiver\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm" Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.774339 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcsnx\" (UniqueName: \"kubernetes.io/projected/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-kube-api-access-lcsnx\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm" Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.774418 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f-sa-token\") pod \"collector-7q2cm\" (UID: \"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f\") " pod="openshift-logging/collector-7q2cm" Mar 14 05:44:32 crc kubenswrapper[4713]: I0314 05:44:32.820730 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-7q2cm" Mar 14 05:44:33 crc kubenswrapper[4713]: I0314 05:44:33.246449 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-7q2cm"] Mar 14 05:44:33 crc kubenswrapper[4713]: I0314 05:44:33.415456 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-7q2cm" event={"ID":"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f","Type":"ContainerStarted","Data":"250f67117c9d5b0a7800add912036384dbd296939fa69830ed48e5ec9691f780"} Mar 14 05:44:33 crc kubenswrapper[4713]: I0314 05:44:33.572310 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="920b9743-48f9-4b1f-a747-8e2bd580c684" path="/var/lib/kubelet/pods/920b9743-48f9-4b1f-a747-8e2bd580c684/volumes" Mar 14 05:44:39 crc kubenswrapper[4713]: I0314 05:44:39.464963 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-7q2cm" event={"ID":"3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f","Type":"ContainerStarted","Data":"6a6db769d879f589d3cfa5ec22aaf478059c5795a095ceee6ab88a23a41af9e8"} Mar 14 05:44:39 crc kubenswrapper[4713]: I0314 05:44:39.497517 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-7q2cm" podStartSLOduration=2.32739116 podStartE2EDuration="7.497487056s" podCreationTimestamp="2026-03-14 05:44:32 +0000 UTC" firstStartedPulling="2026-03-14 05:44:33.260675392 +0000 UTC m=+1056.348584692" lastFinishedPulling="2026-03-14 05:44:38.430771288 +0000 UTC m=+1061.518680588" observedRunningTime="2026-03-14 05:44:39.489680388 +0000 UTC m=+1062.577589708" 
watchObservedRunningTime="2026-03-14 05:44:39.497487056 +0000 UTC m=+1062.585396366" Mar 14 05:44:40 crc kubenswrapper[4713]: I0314 05:44:40.731330 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:44:40 crc kubenswrapper[4713]: I0314 05:44:40.731839 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.145031 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h"] Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.147754 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.150078 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.150308 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.157166 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h"] Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.325157 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13aef23-d004-42c0-9e55-1e350f6cd1b0-config-volume\") pod \"collect-profiles-29557785-pc69h\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.325307 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13aef23-d004-42c0-9e55-1e350f6cd1b0-secret-volume\") pod \"collect-profiles-29557785-pc69h\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.325451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7zkh\" (UniqueName: \"kubernetes.io/projected/a13aef23-d004-42c0-9e55-1e350f6cd1b0-kube-api-access-k7zkh\") pod \"collect-profiles-29557785-pc69h\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.426844 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13aef23-d004-42c0-9e55-1e350f6cd1b0-config-volume\") pod \"collect-profiles-29557785-pc69h\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.426983 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13aef23-d004-42c0-9e55-1e350f6cd1b0-secret-volume\") pod \"collect-profiles-29557785-pc69h\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.427136 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7zkh\" (UniqueName: \"kubernetes.io/projected/a13aef23-d004-42c0-9e55-1e350f6cd1b0-kube-api-access-k7zkh\") pod \"collect-profiles-29557785-pc69h\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.428502 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13aef23-d004-42c0-9e55-1e350f6cd1b0-config-volume\") pod \"collect-profiles-29557785-pc69h\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.435084 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a13aef23-d004-42c0-9e55-1e350f6cd1b0-secret-volume\") pod \"collect-profiles-29557785-pc69h\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.443534 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7zkh\" (UniqueName: \"kubernetes.io/projected/a13aef23-d004-42c0-9e55-1e350f6cd1b0-kube-api-access-k7zkh\") pod \"collect-profiles-29557785-pc69h\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.466182 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:00 crc kubenswrapper[4713]: I0314 05:45:00.843437 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h"] Mar 14 05:45:01 crc kubenswrapper[4713]: I0314 05:45:01.854549 4713 generic.go:334] "Generic (PLEG): container finished" podID="a13aef23-d004-42c0-9e55-1e350f6cd1b0" containerID="29316001ef44b54dbf842e94e245ce492c9ba896bc4f52b7d0f37c23004ecdec" exitCode=0 Mar 14 05:45:01 crc kubenswrapper[4713]: I0314 05:45:01.854599 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" event={"ID":"a13aef23-d004-42c0-9e55-1e350f6cd1b0","Type":"ContainerDied","Data":"29316001ef44b54dbf842e94e245ce492c9ba896bc4f52b7d0f37c23004ecdec"} Mar 14 05:45:01 crc kubenswrapper[4713]: I0314 05:45:01.855005 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" 
event={"ID":"a13aef23-d004-42c0-9e55-1e350f6cd1b0","Type":"ContainerStarted","Data":"fb37c38b3532cb2b4760c097b0775e39f79734afd38a05215b5824c50b594fec"} Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.145948 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.271471 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13aef23-d004-42c0-9e55-1e350f6cd1b0-config-volume\") pod \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.271547 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7zkh\" (UniqueName: \"kubernetes.io/projected/a13aef23-d004-42c0-9e55-1e350f6cd1b0-kube-api-access-k7zkh\") pod \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.271587 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13aef23-d004-42c0-9e55-1e350f6cd1b0-secret-volume\") pod \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\" (UID: \"a13aef23-d004-42c0-9e55-1e350f6cd1b0\") " Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.272136 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13aef23-d004-42c0-9e55-1e350f6cd1b0-config-volume" (OuterVolumeSpecName: "config-volume") pod "a13aef23-d004-42c0-9e55-1e350f6cd1b0" (UID: "a13aef23-d004-42c0-9e55-1e350f6cd1b0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.277533 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13aef23-d004-42c0-9e55-1e350f6cd1b0-kube-api-access-k7zkh" (OuterVolumeSpecName: "kube-api-access-k7zkh") pod "a13aef23-d004-42c0-9e55-1e350f6cd1b0" (UID: "a13aef23-d004-42c0-9e55-1e350f6cd1b0"). InnerVolumeSpecName "kube-api-access-k7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.278272 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13aef23-d004-42c0-9e55-1e350f6cd1b0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a13aef23-d004-42c0-9e55-1e350f6cd1b0" (UID: "a13aef23-d004-42c0-9e55-1e350f6cd1b0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.373992 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13aef23-d004-42c0-9e55-1e350f6cd1b0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.374039 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7zkh\" (UniqueName: \"kubernetes.io/projected/a13aef23-d004-42c0-9e55-1e350f6cd1b0-kube-api-access-k7zkh\") on node \"crc\" DevicePath \"\"" Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.374054 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13aef23-d004-42c0-9e55-1e350f6cd1b0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.870281 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" 
event={"ID":"a13aef23-d004-42c0-9e55-1e350f6cd1b0","Type":"ContainerDied","Data":"fb37c38b3532cb2b4760c097b0775e39f79734afd38a05215b5824c50b594fec"} Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.870319 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb37c38b3532cb2b4760c097b0775e39f79734afd38a05215b5824c50b594fec" Mar 14 05:45:03 crc kubenswrapper[4713]: I0314 05:45:03.870408 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h" Mar 14 05:45:10 crc kubenswrapper[4713]: I0314 05:45:10.898917 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:45:10 crc kubenswrapper[4713]: I0314 05:45:10.900089 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:45:10 crc kubenswrapper[4713]: I0314 05:45:10.900181 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:45:10 crc kubenswrapper[4713]: I0314 05:45:10.901736 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b10f030c50b79b6c8fa097a372898693b75a3027ed8338227f2cd1cda4fb2db1"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 05:45:10 crc 
kubenswrapper[4713]: I0314 05:45:10.901811 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://b10f030c50b79b6c8fa097a372898693b75a3027ed8338227f2cd1cda4fb2db1" gracePeriod=600 Mar 14 05:45:12 crc kubenswrapper[4713]: I0314 05:45:12.077755 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"b10f030c50b79b6c8fa097a372898693b75a3027ed8338227f2cd1cda4fb2db1"} Mar 14 05:45:12 crc kubenswrapper[4713]: I0314 05:45:12.078290 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="b10f030c50b79b6c8fa097a372898693b75a3027ed8338227f2cd1cda4fb2db1" exitCode=0 Mar 14 05:45:12 crc kubenswrapper[4713]: I0314 05:45:12.078422 4713 scope.go:117] "RemoveContainer" containerID="57dee1484521e9bfb2409914893e35a03113f50890dbf98510a8c171581cf4ea" Mar 14 05:45:12 crc kubenswrapper[4713]: I0314 05:45:12.078459 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"0deb757193504738dea1bcbab0c00a2cad7d7bcdee1ff823b40c52d856730f6e"} Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.178842 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx"] Mar 14 05:45:16 crc kubenswrapper[4713]: E0314 05:45:16.180277 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13aef23-d004-42c0-9e55-1e350f6cd1b0" containerName="collect-profiles" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.180294 4713 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a13aef23-d004-42c0-9e55-1e350f6cd1b0" containerName="collect-profiles" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.180466 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13aef23-d004-42c0-9e55-1e350f6cd1b0" containerName="collect-profiles" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.181986 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.187243 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.196468 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx"] Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.302302 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn9j8\" (UniqueName: \"kubernetes.io/projected/6fc0ef8b-404f-4614-9bb1-57550994c275-kube-api-access-zn9j8\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.302712 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.302845 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.404694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.404757 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn9j8\" (UniqueName: \"kubernetes.io/projected/6fc0ef8b-404f-4614-9bb1-57550994c275-kube-api-access-zn9j8\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.404809 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.405243 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-bundle\") 
pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.405438 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.424732 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn9j8\" (UniqueName: \"kubernetes.io/projected/6fc0ef8b-404f-4614-9bb1-57550994c275-kube-api-access-zn9j8\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.503081 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:16 crc kubenswrapper[4713]: I0314 05:45:16.943607 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx"] Mar 14 05:45:17 crc kubenswrapper[4713]: I0314 05:45:17.117576 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" event={"ID":"6fc0ef8b-404f-4614-9bb1-57550994c275","Type":"ContainerStarted","Data":"d4e6056300ca1d9cf1abe60e229c9a4c55e49e684479a620cda6e6276188de1c"} Mar 14 05:45:17 crc kubenswrapper[4713]: I0314 05:45:17.117627 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" event={"ID":"6fc0ef8b-404f-4614-9bb1-57550994c275","Type":"ContainerStarted","Data":"8b1ea906211fc26cb6adfa3acd9b2106798bf0657907bbfc8707373bd245ad8d"} Mar 14 05:45:18 crc kubenswrapper[4713]: I0314 05:45:18.126090 4713 generic.go:334] "Generic (PLEG): container finished" podID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerID="d4e6056300ca1d9cf1abe60e229c9a4c55e49e684479a620cda6e6276188de1c" exitCode=0 Mar 14 05:45:18 crc kubenswrapper[4713]: I0314 05:45:18.126138 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" event={"ID":"6fc0ef8b-404f-4614-9bb1-57550994c275","Type":"ContainerDied","Data":"d4e6056300ca1d9cf1abe60e229c9a4c55e49e684479a620cda6e6276188de1c"} Mar 14 05:45:21 crc kubenswrapper[4713]: I0314 05:45:21.145733 4713 generic.go:334] "Generic (PLEG): container finished" podID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerID="fabc7e03a716468282b6ea210315bc53ac466fa2e1e98a6f4c501f534a9bcbfc" exitCode=0 Mar 14 05:45:21 crc kubenswrapper[4713]: I0314 05:45:21.145778 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" event={"ID":"6fc0ef8b-404f-4614-9bb1-57550994c275","Type":"ContainerDied","Data":"fabc7e03a716468282b6ea210315bc53ac466fa2e1e98a6f4c501f534a9bcbfc"} Mar 14 05:45:22 crc kubenswrapper[4713]: I0314 05:45:22.157055 4713 generic.go:334] "Generic (PLEG): container finished" podID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerID="2a5a8cc05c2bb02c263df4b4b606c868b8eb266b9f8541cbc30c90aaa7eb5732" exitCode=0 Mar 14 05:45:22 crc kubenswrapper[4713]: I0314 05:45:22.157112 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" event={"ID":"6fc0ef8b-404f-4614-9bb1-57550994c275","Type":"ContainerDied","Data":"2a5a8cc05c2bb02c263df4b4b606c868b8eb266b9f8541cbc30c90aaa7eb5732"} Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.459680 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.656521 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-bundle\") pod \"6fc0ef8b-404f-4614-9bb1-57550994c275\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.656868 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-util\") pod \"6fc0ef8b-404f-4614-9bb1-57550994c275\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.657028 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn9j8\" (UniqueName: \"kubernetes.io/projected/6fc0ef8b-404f-4614-9bb1-57550994c275-kube-api-access-zn9j8\") pod \"6fc0ef8b-404f-4614-9bb1-57550994c275\" (UID: \"6fc0ef8b-404f-4614-9bb1-57550994c275\") " Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.657633 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-bundle" (OuterVolumeSpecName: "bundle") pod "6fc0ef8b-404f-4614-9bb1-57550994c275" (UID: "6fc0ef8b-404f-4614-9bb1-57550994c275"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.665087 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc0ef8b-404f-4614-9bb1-57550994c275-kube-api-access-zn9j8" (OuterVolumeSpecName: "kube-api-access-zn9j8") pod "6fc0ef8b-404f-4614-9bb1-57550994c275" (UID: "6fc0ef8b-404f-4614-9bb1-57550994c275"). InnerVolumeSpecName "kube-api-access-zn9j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.670697 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-util" (OuterVolumeSpecName: "util") pod "6fc0ef8b-404f-4614-9bb1-57550994c275" (UID: "6fc0ef8b-404f-4614-9bb1-57550994c275"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.759021 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn9j8\" (UniqueName: \"kubernetes.io/projected/6fc0ef8b-404f-4614-9bb1-57550994c275-kube-api-access-zn9j8\") on node \"crc\" DevicePath \"\"" Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.759073 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:45:23 crc kubenswrapper[4713]: I0314 05:45:23.759089 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fc0ef8b-404f-4614-9bb1-57550994c275-util\") on node \"crc\" DevicePath \"\"" Mar 14 05:45:24 crc kubenswrapper[4713]: I0314 05:45:24.173736 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" event={"ID":"6fc0ef8b-404f-4614-9bb1-57550994c275","Type":"ContainerDied","Data":"8b1ea906211fc26cb6adfa3acd9b2106798bf0657907bbfc8707373bd245ad8d"} Mar 14 05:45:24 crc kubenswrapper[4713]: I0314 05:45:24.173804 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx" Mar 14 05:45:24 crc kubenswrapper[4713]: I0314 05:45:24.173821 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1ea906211fc26cb6adfa3acd9b2106798bf0657907bbfc8707373bd245ad8d" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.933370 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc"] Mar 14 05:45:27 crc kubenswrapper[4713]: E0314 05:45:27.934475 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerName="extract" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.934489 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerName="extract" Mar 14 05:45:27 crc kubenswrapper[4713]: E0314 05:45:27.934504 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerName="pull" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.934511 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerName="pull" Mar 14 05:45:27 crc kubenswrapper[4713]: E0314 05:45:27.934522 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerName="util" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.934532 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerName="util" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.934682 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fc0ef8b-404f-4614-9bb1-57550994c275" containerName="extract" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.935379 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.938193 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-k2qs8" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.938732 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.939252 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 14 05:45:27 crc kubenswrapper[4713]: I0314 05:45:27.942656 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc"] Mar 14 05:45:28 crc kubenswrapper[4713]: I0314 05:45:28.126002 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xtwg\" (UniqueName: \"kubernetes.io/projected/0069d74d-74f2-4d61-b37f-d710febf8c1e-kube-api-access-7xtwg\") pod \"nmstate-operator-796d4cfff4-t6jvc\" (UID: \"0069d74d-74f2-4d61-b37f-d710febf8c1e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc" Mar 14 05:45:28 crc kubenswrapper[4713]: I0314 05:45:28.227229 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xtwg\" (UniqueName: \"kubernetes.io/projected/0069d74d-74f2-4d61-b37f-d710febf8c1e-kube-api-access-7xtwg\") pod \"nmstate-operator-796d4cfff4-t6jvc\" (UID: \"0069d74d-74f2-4d61-b37f-d710febf8c1e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc" Mar 14 05:45:28 crc kubenswrapper[4713]: I0314 05:45:28.265382 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xtwg\" (UniqueName: \"kubernetes.io/projected/0069d74d-74f2-4d61-b37f-d710febf8c1e-kube-api-access-7xtwg\") pod \"nmstate-operator-796d4cfff4-t6jvc\" (UID: 
\"0069d74d-74f2-4d61-b37f-d710febf8c1e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc" Mar 14 05:45:28 crc kubenswrapper[4713]: I0314 05:45:28.563268 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc" Mar 14 05:45:29 crc kubenswrapper[4713]: I0314 05:45:29.044525 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc"] Mar 14 05:45:29 crc kubenswrapper[4713]: W0314 05:45:29.050895 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0069d74d_74f2_4d61_b37f_d710febf8c1e.slice/crio-e1790fd2d5db94351d0eb3b3658d169201ebf73effd4d252c5799f74c4ec7478 WatchSource:0}: Error finding container e1790fd2d5db94351d0eb3b3658d169201ebf73effd4d252c5799f74c4ec7478: Status 404 returned error can't find the container with id e1790fd2d5db94351d0eb3b3658d169201ebf73effd4d252c5799f74c4ec7478 Mar 14 05:45:29 crc kubenswrapper[4713]: I0314 05:45:29.223301 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc" event={"ID":"0069d74d-74f2-4d61-b37f-d710febf8c1e","Type":"ContainerStarted","Data":"e1790fd2d5db94351d0eb3b3658d169201ebf73effd4d252c5799f74c4ec7478"} Mar 14 05:45:33 crc kubenswrapper[4713]: I0314 05:45:33.254225 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc" event={"ID":"0069d74d-74f2-4d61-b37f-d710febf8c1e","Type":"ContainerStarted","Data":"75d87d06f5b2ec874f1a596ae6e24396178288ade833f2444ddc4726057133d0"} Mar 14 05:45:33 crc kubenswrapper[4713]: I0314 05:45:33.273005 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-t6jvc" podStartSLOduration=2.805281801 podStartE2EDuration="6.272989003s" podCreationTimestamp="2026-03-14 05:45:27 +0000 UTC" 
firstStartedPulling="2026-03-14 05:45:29.053916182 +0000 UTC m=+1112.141825482" lastFinishedPulling="2026-03-14 05:45:32.521623384 +0000 UTC m=+1115.609532684" observedRunningTime="2026-03-14 05:45:33.271508827 +0000 UTC m=+1116.359418137" watchObservedRunningTime="2026-03-14 05:45:33.272989003 +0000 UTC m=+1116.360898303" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.389880 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb"] Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.400882 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.407087 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-q4frz" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.420376 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb"] Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.442526 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrs9t\" (UniqueName: \"kubernetes.io/projected/01d732e4-c4d2-4b34-8335-8f5f9b2299cd-kube-api-access-zrs9t\") pod \"nmstate-metrics-9b8c8685d-xscpb\" (UID: \"01d732e4-c4d2-4b34-8335-8f5f9b2299cd\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.461609 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-25r25"] Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.463659 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.477905 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.480386 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-cdmmg"] Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.481988 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.502354 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-25r25"] Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.544571 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrs9t\" (UniqueName: \"kubernetes.io/projected/01d732e4-c4d2-4b34-8335-8f5f9b2299cd-kube-api-access-zrs9t\") pod \"nmstate-metrics-9b8c8685d-xscpb\" (UID: \"01d732e4-c4d2-4b34-8335-8f5f9b2299cd\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.544650 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a3c3dff8-a2ea-4073-a6ca-c391aaf296d0-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-25r25\" (UID: \"a3c3dff8-a2ea-4073-a6ca-c391aaf296d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.544704 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0eb6e7a3-da24-4bd6-8850-db445903fc2a-nmstate-lock\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " 
pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.544743 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0eb6e7a3-da24-4bd6-8850-db445903fc2a-ovs-socket\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.544766 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tc92\" (UniqueName: \"kubernetes.io/projected/a3c3dff8-a2ea-4073-a6ca-c391aaf296d0-kube-api-access-7tc92\") pod \"nmstate-webhook-5f558f5558-25r25\" (UID: \"a3c3dff8-a2ea-4073-a6ca-c391aaf296d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.544884 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nb46\" (UniqueName: \"kubernetes.io/projected/0eb6e7a3-da24-4bd6-8850-db445903fc2a-kube-api-access-8nb46\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.544931 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0eb6e7a3-da24-4bd6-8850-db445903fc2a-dbus-socket\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.605474 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrs9t\" (UniqueName: \"kubernetes.io/projected/01d732e4-c4d2-4b34-8335-8f5f9b2299cd-kube-api-access-zrs9t\") pod \"nmstate-metrics-9b8c8685d-xscpb\" (UID: 
\"01d732e4-c4d2-4b34-8335-8f5f9b2299cd\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.644354 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp"] Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.645389 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.646402 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0eb6e7a3-da24-4bd6-8850-db445903fc2a-ovs-socket\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.646452 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tc92\" (UniqueName: \"kubernetes.io/projected/a3c3dff8-a2ea-4073-a6ca-c391aaf296d0-kube-api-access-7tc92\") pod \"nmstate-webhook-5f558f5558-25r25\" (UID: \"a3c3dff8-a2ea-4073-a6ca-c391aaf296d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.646509 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nb46\" (UniqueName: \"kubernetes.io/projected/0eb6e7a3-da24-4bd6-8850-db445903fc2a-kube-api-access-8nb46\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.646531 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0eb6e7a3-da24-4bd6-8850-db445903fc2a-dbus-socket\") pod \"nmstate-handler-cdmmg\" (UID: 
\"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.646807 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0eb6e7a3-da24-4bd6-8850-db445903fc2a-ovs-socket\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.646644 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a3c3dff8-a2ea-4073-a6ca-c391aaf296d0-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-25r25\" (UID: \"a3c3dff8-a2ea-4073-a6ca-c391aaf296d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.646973 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0eb6e7a3-da24-4bd6-8850-db445903fc2a-nmstate-lock\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.647096 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0eb6e7a3-da24-4bd6-8850-db445903fc2a-nmstate-lock\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.648865 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0eb6e7a3-da24-4bd6-8850-db445903fc2a-dbus-socket\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc 
kubenswrapper[4713]: I0314 05:45:37.652516 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.658377 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-psmwd" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.657545 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a3c3dff8-a2ea-4073-a6ca-c391aaf296d0-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-25r25\" (UID: \"a3c3dff8-a2ea-4073-a6ca-c391aaf296d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.658717 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.658749 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp"] Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.691313 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tc92\" (UniqueName: \"kubernetes.io/projected/a3c3dff8-a2ea-4073-a6ca-c391aaf296d0-kube-api-access-7tc92\") pod \"nmstate-webhook-5f558f5558-25r25\" (UID: \"a3c3dff8-a2ea-4073-a6ca-c391aaf296d0\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.726815 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nb46\" (UniqueName: \"kubernetes.io/projected/0eb6e7a3-da24-4bd6-8850-db445903fc2a-kube-api-access-8nb46\") pod \"nmstate-handler-cdmmg\" (UID: \"0eb6e7a3-da24-4bd6-8850-db445903fc2a\") " pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.732572 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.748755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zv57\" (UniqueName: \"kubernetes.io/projected/d5bad799-6929-4d8a-ab6e-7463d787e8e0-kube-api-access-7zv57\") pod \"nmstate-console-plugin-86f58fcf4-bq9hp\" (UID: \"d5bad799-6929-4d8a-ab6e-7463d787e8e0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.748810 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d5bad799-6929-4d8a-ab6e-7463d787e8e0-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-bq9hp\" (UID: \"d5bad799-6929-4d8a-ab6e-7463d787e8e0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.748889 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5bad799-6929-4d8a-ab6e-7463d787e8e0-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-bq9hp\" (UID: \"d5bad799-6929-4d8a-ab6e-7463d787e8e0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.800911 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.824426 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.849993 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zv57\" (UniqueName: \"kubernetes.io/projected/d5bad799-6929-4d8a-ab6e-7463d787e8e0-kube-api-access-7zv57\") pod \"nmstate-console-plugin-86f58fcf4-bq9hp\" (UID: \"d5bad799-6929-4d8a-ab6e-7463d787e8e0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.850040 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d5bad799-6929-4d8a-ab6e-7463d787e8e0-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-bq9hp\" (UID: \"d5bad799-6929-4d8a-ab6e-7463d787e8e0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.850110 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5bad799-6929-4d8a-ab6e-7463d787e8e0-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-bq9hp\" (UID: \"d5bad799-6929-4d8a-ab6e-7463d787e8e0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.851335 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d5bad799-6929-4d8a-ab6e-7463d787e8e0-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-bq9hp\" (UID: \"d5bad799-6929-4d8a-ab6e-7463d787e8e0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.855140 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5bad799-6929-4d8a-ab6e-7463d787e8e0-plugin-serving-cert\") pod 
\"nmstate-console-plugin-86f58fcf4-bq9hp\" (UID: \"d5bad799-6929-4d8a-ab6e-7463d787e8e0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:37 crc kubenswrapper[4713]: I0314 05:45:37.882113 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zv57\" (UniqueName: \"kubernetes.io/projected/d5bad799-6929-4d8a-ab6e-7463d787e8e0-kube-api-access-7zv57\") pod \"nmstate-console-plugin-86f58fcf4-bq9hp\" (UID: \"d5bad799-6929-4d8a-ab6e-7463d787e8e0\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.002898 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-999d8b566-8h8bm"] Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.004534 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.024389 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-999d8b566-8h8bm"] Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.114372 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.158161 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-console-config\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.158363 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-service-ca\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.158402 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-trusted-ca-bundle\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.158428 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-oauth-config\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.158451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xn64\" (UniqueName: 
\"kubernetes.io/projected/a79b34eb-9b98-45d3-b470-c3925639b028-kube-api-access-5xn64\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.158473 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-serving-cert\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.158488 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-oauth-serving-cert\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.260245 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-trusted-ca-bundle\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.260302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-oauth-config\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.260347 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5xn64\" (UniqueName: \"kubernetes.io/projected/a79b34eb-9b98-45d3-b470-c3925639b028-kube-api-access-5xn64\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.260374 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-serving-cert\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.260399 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-oauth-serving-cert\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.260466 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-console-config\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.260547 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-service-ca\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.261521 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-service-ca\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.262165 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-oauth-serving-cert\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.262469 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-console-config\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.262521 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-trusted-ca-bundle\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.266420 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-oauth-config\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.267154 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-serving-cert\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.288216 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xn64\" (UniqueName: \"kubernetes.io/projected/a79b34eb-9b98-45d3-b470-c3925639b028-kube-api-access-5xn64\") pod \"console-999d8b566-8h8bm\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") " pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.327273 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cdmmg" event={"ID":"0eb6e7a3-da24-4bd6-8850-db445903fc2a","Type":"ContainerStarted","Data":"a05d0e54e239e1d1c97a32259b46bb4ca977fc009e147c1ca3e627664d8548a8"} Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.350550 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.357473 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb"] Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.442104 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-25r25"] Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.589184 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp"] Mar 14 05:45:38 crc kubenswrapper[4713]: I0314 05:45:38.806756 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-999d8b566-8h8bm"] Mar 14 05:45:38 crc kubenswrapper[4713]: W0314 05:45:38.808327 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79b34eb_9b98_45d3_b470_c3925639b028.slice/crio-5e0ed0bd91dc24dcffc9cd139d535ecd7ad69afe9428e56b270bfdbe94068e61 WatchSource:0}: Error finding container 5e0ed0bd91dc24dcffc9cd139d535ecd7ad69afe9428e56b270bfdbe94068e61: Status 404 returned error can't find the container with id 5e0ed0bd91dc24dcffc9cd139d535ecd7ad69afe9428e56b270bfdbe94068e61 Mar 14 05:45:39 crc kubenswrapper[4713]: I0314 05:45:39.337361 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-999d8b566-8h8bm" event={"ID":"a79b34eb-9b98-45d3-b470-c3925639b028","Type":"ContainerStarted","Data":"bb3065bb5c298c9255c05eea1c69f4c10236bf269c53c3c06af989819c461aa8"} Mar 14 05:45:39 crc kubenswrapper[4713]: I0314 05:45:39.337664 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-999d8b566-8h8bm" event={"ID":"a79b34eb-9b98-45d3-b470-c3925639b028","Type":"ContainerStarted","Data":"5e0ed0bd91dc24dcffc9cd139d535ecd7ad69afe9428e56b270bfdbe94068e61"} Mar 14 05:45:39 crc kubenswrapper[4713]: I0314 
05:45:39.338677 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb" event={"ID":"01d732e4-c4d2-4b34-8335-8f5f9b2299cd","Type":"ContainerStarted","Data":"52dca552ee976380882a157c47b6641b35b2d88b6bf50fab147ea30580c88a13"} Mar 14 05:45:39 crc kubenswrapper[4713]: I0314 05:45:39.340289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" event={"ID":"a3c3dff8-a2ea-4073-a6ca-c391aaf296d0","Type":"ContainerStarted","Data":"02e99d3dbab8ab3b92e93087e1310afe51a1004d9679637b57c8d9c131da37eb"} Mar 14 05:45:39 crc kubenswrapper[4713]: I0314 05:45:39.341486 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" event={"ID":"d5bad799-6929-4d8a-ab6e-7463d787e8e0","Type":"ContainerStarted","Data":"69c275a1f284c21318a448aaa4b0e07a2748f4e531f6f7a5f609fc2855f2ddf7"} Mar 14 05:45:39 crc kubenswrapper[4713]: I0314 05:45:39.369655 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-999d8b566-8h8bm" podStartSLOduration=2.369624536 podStartE2EDuration="2.369624536s" podCreationTimestamp="2026-03-14 05:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:45:39.35811477 +0000 UTC m=+1122.446024090" watchObservedRunningTime="2026-03-14 05:45:39.369624536 +0000 UTC m=+1122.457533836" Mar 14 05:45:44 crc kubenswrapper[4713]: I0314 05:45:44.388410 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb" event={"ID":"01d732e4-c4d2-4b34-8335-8f5f9b2299cd","Type":"ContainerStarted","Data":"92c35a242156c9a4969b8f5063c59d70d07a12d3f5692cd848ff1771d946cc89"} Mar 14 05:45:44 crc kubenswrapper[4713]: I0314 05:45:44.389916 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" event={"ID":"a3c3dff8-a2ea-4073-a6ca-c391aaf296d0","Type":"ContainerStarted","Data":"6bf6c08dd3d82ae4b0713217e313211e4a7255a11576439f7993a07488418cba"} Mar 14 05:45:44 crc kubenswrapper[4713]: I0314 05:45:44.390002 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:45:44 crc kubenswrapper[4713]: I0314 05:45:44.391840 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" event={"ID":"d5bad799-6929-4d8a-ab6e-7463d787e8e0","Type":"ContainerStarted","Data":"b702282806e700f49e9398c64326e9da6b3db9fb954e368dd64a84364cf4b4dc"} Mar 14 05:45:44 crc kubenswrapper[4713]: I0314 05:45:44.395040 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cdmmg" event={"ID":"0eb6e7a3-da24-4bd6-8850-db445903fc2a","Type":"ContainerStarted","Data":"a61db22930bfffa8f15c4c950b170801187051c65f6eb69402b224a5362df0a6"} Mar 14 05:45:44 crc kubenswrapper[4713]: I0314 05:45:44.395285 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:44 crc kubenswrapper[4713]: I0314 05:45:44.409944 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" podStartSLOduration=2.684138978 podStartE2EDuration="7.409858148s" podCreationTimestamp="2026-03-14 05:45:37 +0000 UTC" firstStartedPulling="2026-03-14 05:45:38.453691768 +0000 UTC m=+1121.541601068" lastFinishedPulling="2026-03-14 05:45:43.179410938 +0000 UTC m=+1126.267320238" observedRunningTime="2026-03-14 05:45:44.409364922 +0000 UTC m=+1127.497274222" watchObservedRunningTime="2026-03-14 05:45:44.409858148 +0000 UTC m=+1127.497767458" Mar 14 05:45:44 crc kubenswrapper[4713]: I0314 05:45:44.447639 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-handler-cdmmg" podStartSLOduration=2.150203751 podStartE2EDuration="7.44760861s" podCreationTimestamp="2026-03-14 05:45:37 +0000 UTC" firstStartedPulling="2026-03-14 05:45:37.919523993 +0000 UTC m=+1121.007433293" lastFinishedPulling="2026-03-14 05:45:43.216928862 +0000 UTC m=+1126.304838152" observedRunningTime="2026-03-14 05:45:44.439325077 +0000 UTC m=+1127.527234377" watchObservedRunningTime="2026-03-14 05:45:44.44760861 +0000 UTC m=+1127.535517910" Mar 14 05:45:44 crc kubenswrapper[4713]: I0314 05:45:44.467465 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bq9hp" podStartSLOduration=2.9005171069999998 podStartE2EDuration="7.467440342s" podCreationTimestamp="2026-03-14 05:45:37 +0000 UTC" firstStartedPulling="2026-03-14 05:45:38.601869445 +0000 UTC m=+1121.689778745" lastFinishedPulling="2026-03-14 05:45:43.16879268 +0000 UTC m=+1126.256701980" observedRunningTime="2026-03-14 05:45:44.455308575 +0000 UTC m=+1127.543217875" watchObservedRunningTime="2026-03-14 05:45:44.467440342 +0000 UTC m=+1127.555349642" Mar 14 05:45:46 crc kubenswrapper[4713]: I0314 05:45:46.413300 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb" event={"ID":"01d732e4-c4d2-4b34-8335-8f5f9b2299cd","Type":"ContainerStarted","Data":"7fc77536cb25701c93753c0f8c8b96ad7bd13a6d9c11ce3e1d92ead0755742f0"} Mar 14 05:45:46 crc kubenswrapper[4713]: I0314 05:45:46.444803 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xscpb" podStartSLOduration=1.841289786 podStartE2EDuration="9.444764448s" podCreationTimestamp="2026-03-14 05:45:37 +0000 UTC" firstStartedPulling="2026-03-14 05:45:38.372371679 +0000 UTC m=+1121.460280979" lastFinishedPulling="2026-03-14 05:45:45.975846341 +0000 UTC m=+1129.063755641" observedRunningTime="2026-03-14 05:45:46.441676259 +0000 UTC 
m=+1129.529585559" watchObservedRunningTime="2026-03-14 05:45:46.444764448 +0000 UTC m=+1129.532673758" Mar 14 05:45:48 crc kubenswrapper[4713]: I0314 05:45:48.351845 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:48 crc kubenswrapper[4713]: I0314 05:45:48.352394 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:48 crc kubenswrapper[4713]: I0314 05:45:48.357479 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:48 crc kubenswrapper[4713]: I0314 05:45:48.440914 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-999d8b566-8h8bm" Mar 14 05:45:48 crc kubenswrapper[4713]: I0314 05:45:48.504733 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-764cb777d4-qh4kb"] Mar 14 05:45:52 crc kubenswrapper[4713]: I0314 05:45:52.852475 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-cdmmg" Mar 14 05:45:57 crc kubenswrapper[4713]: I0314 05:45:57.808885 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.138678 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557786-qtpnn"] Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.140139 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557786-qtpnn" Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.142613 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.143443 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.143652 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.146230 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557786-qtpnn"] Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.202297 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bblbf\" (UniqueName: \"kubernetes.io/projected/77944775-7eab-4ede-8a7d-31a489a29ae3-kube-api-access-bblbf\") pod \"auto-csr-approver-29557786-qtpnn\" (UID: \"77944775-7eab-4ede-8a7d-31a489a29ae3\") " pod="openshift-infra/auto-csr-approver-29557786-qtpnn" Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.304499 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bblbf\" (UniqueName: \"kubernetes.io/projected/77944775-7eab-4ede-8a7d-31a489a29ae3-kube-api-access-bblbf\") pod \"auto-csr-approver-29557786-qtpnn\" (UID: \"77944775-7eab-4ede-8a7d-31a489a29ae3\") " pod="openshift-infra/auto-csr-approver-29557786-qtpnn" Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.326265 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bblbf\" (UniqueName: \"kubernetes.io/projected/77944775-7eab-4ede-8a7d-31a489a29ae3-kube-api-access-bblbf\") pod \"auto-csr-approver-29557786-qtpnn\" (UID: \"77944775-7eab-4ede-8a7d-31a489a29ae3\") " 
pod="openshift-infra/auto-csr-approver-29557786-qtpnn" Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.462886 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557786-qtpnn" Mar 14 05:46:00 crc kubenswrapper[4713]: I0314 05:46:00.888856 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557786-qtpnn"] Mar 14 05:46:00 crc kubenswrapper[4713]: W0314 05:46:00.900824 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77944775_7eab_4ede_8a7d_31a489a29ae3.slice/crio-cdf0802728b2d195cc928acfcb40a0f6f563a6af0b332f3d9ecc3d61fa647bae WatchSource:0}: Error finding container cdf0802728b2d195cc928acfcb40a0f6f563a6af0b332f3d9ecc3d61fa647bae: Status 404 returned error can't find the container with id cdf0802728b2d195cc928acfcb40a0f6f563a6af0b332f3d9ecc3d61fa647bae Mar 14 05:46:01 crc kubenswrapper[4713]: I0314 05:46:01.770788 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557786-qtpnn" event={"ID":"77944775-7eab-4ede-8a7d-31a489a29ae3","Type":"ContainerStarted","Data":"cdf0802728b2d195cc928acfcb40a0f6f563a6af0b332f3d9ecc3d61fa647bae"} Mar 14 05:46:02 crc kubenswrapper[4713]: I0314 05:46:02.780333 4713 generic.go:334] "Generic (PLEG): container finished" podID="77944775-7eab-4ede-8a7d-31a489a29ae3" containerID="05f9987dc802058b1c3afcda5115b37ce74c083a3ec3bed5fcb50c7473f8a311" exitCode=0 Mar 14 05:46:02 crc kubenswrapper[4713]: I0314 05:46:02.781694 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557786-qtpnn" event={"ID":"77944775-7eab-4ede-8a7d-31a489a29ae3","Type":"ContainerDied","Data":"05f9987dc802058b1c3afcda5115b37ce74c083a3ec3bed5fcb50c7473f8a311"} Mar 14 05:46:04 crc kubenswrapper[4713]: I0314 05:46:04.092992 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557786-qtpnn" Mar 14 05:46:04 crc kubenswrapper[4713]: I0314 05:46:04.180919 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bblbf\" (UniqueName: \"kubernetes.io/projected/77944775-7eab-4ede-8a7d-31a489a29ae3-kube-api-access-bblbf\") pod \"77944775-7eab-4ede-8a7d-31a489a29ae3\" (UID: \"77944775-7eab-4ede-8a7d-31a489a29ae3\") " Mar 14 05:46:04 crc kubenswrapper[4713]: I0314 05:46:04.189202 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77944775-7eab-4ede-8a7d-31a489a29ae3-kube-api-access-bblbf" (OuterVolumeSpecName: "kube-api-access-bblbf") pod "77944775-7eab-4ede-8a7d-31a489a29ae3" (UID: "77944775-7eab-4ede-8a7d-31a489a29ae3"). InnerVolumeSpecName "kube-api-access-bblbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:46:04 crc kubenswrapper[4713]: I0314 05:46:04.285127 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bblbf\" (UniqueName: \"kubernetes.io/projected/77944775-7eab-4ede-8a7d-31a489a29ae3-kube-api-access-bblbf\") on node \"crc\" DevicePath \"\"" Mar 14 05:46:04 crc kubenswrapper[4713]: I0314 05:46:04.801150 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557786-qtpnn" event={"ID":"77944775-7eab-4ede-8a7d-31a489a29ae3","Type":"ContainerDied","Data":"cdf0802728b2d195cc928acfcb40a0f6f563a6af0b332f3d9ecc3d61fa647bae"} Mar 14 05:46:04 crc kubenswrapper[4713]: I0314 05:46:04.801225 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf0802728b2d195cc928acfcb40a0f6f563a6af0b332f3d9ecc3d61fa647bae" Mar 14 05:46:04 crc kubenswrapper[4713]: I0314 05:46:04.801261 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557786-qtpnn" Mar 14 05:46:05 crc kubenswrapper[4713]: I0314 05:46:05.154627 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557780-d2clc"] Mar 14 05:46:05 crc kubenswrapper[4713]: I0314 05:46:05.160295 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557780-d2clc"] Mar 14 05:46:05 crc kubenswrapper[4713]: I0314 05:46:05.576022 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356" path="/var/lib/kubelet/pods/5bcbe2c7-a3ff-4380-a7d0-f80d2c08f356/volumes" Mar 14 05:46:10 crc kubenswrapper[4713]: I0314 05:46:10.925955 4713 scope.go:117] "RemoveContainer" containerID="1c1baa012cdbfb727466711887cec7f8e032ffc4fadb8a42bf3af7953ad0ef34" Mar 14 05:46:13 crc kubenswrapper[4713]: I0314 05:46:13.560900 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-764cb777d4-qh4kb" podUID="30662ccb-2b29-409a-8ccc-6e68c5d7435a" containerName="console" containerID="cri-o://b7c3030a8d99f0fa0ce902c3ef3ab3b397456026b6cd10a0e7cfc1344cade9ce" gracePeriod=15 Mar 14 05:46:13 crc kubenswrapper[4713]: I0314 05:46:13.892999 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764cb777d4-qh4kb_30662ccb-2b29-409a-8ccc-6e68c5d7435a/console/0.log" Mar 14 05:46:13 crc kubenswrapper[4713]: I0314 05:46:13.893420 4713 generic.go:334] "Generic (PLEG): container finished" podID="30662ccb-2b29-409a-8ccc-6e68c5d7435a" containerID="b7c3030a8d99f0fa0ce902c3ef3ab3b397456026b6cd10a0e7cfc1344cade9ce" exitCode=2 Mar 14 05:46:13 crc kubenswrapper[4713]: I0314 05:46:13.893458 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764cb777d4-qh4kb" 
event={"ID":"30662ccb-2b29-409a-8ccc-6e68c5d7435a","Type":"ContainerDied","Data":"b7c3030a8d99f0fa0ce902c3ef3ab3b397456026b6cd10a0e7cfc1344cade9ce"} Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.013696 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764cb777d4-qh4kb_30662ccb-2b29-409a-8ccc-6e68c5d7435a/console/0.log" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.013760 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.202912 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-serving-cert\") pod \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.203026 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-trusted-ca-bundle\") pod \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.203095 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhkvf\" (UniqueName: \"kubernetes.io/projected/30662ccb-2b29-409a-8ccc-6e68c5d7435a-kube-api-access-mhkvf\") pod \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.203115 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-config\") pod \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\" (UID: 
\"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.203142 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-oauth-config\") pod \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.203175 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-oauth-serving-cert\") pod \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.203216 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-service-ca\") pod \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\" (UID: \"30662ccb-2b29-409a-8ccc-6e68c5d7435a\") " Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.204007 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-service-ca" (OuterVolumeSpecName: "service-ca") pod "30662ccb-2b29-409a-8ccc-6e68c5d7435a" (UID: "30662ccb-2b29-409a-8ccc-6e68c5d7435a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.204021 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "30662ccb-2b29-409a-8ccc-6e68c5d7435a" (UID: "30662ccb-2b29-409a-8ccc-6e68c5d7435a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.204342 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-config" (OuterVolumeSpecName: "console-config") pod "30662ccb-2b29-409a-8ccc-6e68c5d7435a" (UID: "30662ccb-2b29-409a-8ccc-6e68c5d7435a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.204571 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "30662ccb-2b29-409a-8ccc-6e68c5d7435a" (UID: "30662ccb-2b29-409a-8ccc-6e68c5d7435a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.209012 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "30662ccb-2b29-409a-8ccc-6e68c5d7435a" (UID: "30662ccb-2b29-409a-8ccc-6e68c5d7435a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.209395 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "30662ccb-2b29-409a-8ccc-6e68c5d7435a" (UID: "30662ccb-2b29-409a-8ccc-6e68c5d7435a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.210681 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30662ccb-2b29-409a-8ccc-6e68c5d7435a-kube-api-access-mhkvf" (OuterVolumeSpecName: "kube-api-access-mhkvf") pod "30662ccb-2b29-409a-8ccc-6e68c5d7435a" (UID: "30662ccb-2b29-409a-8ccc-6e68c5d7435a"). InnerVolumeSpecName "kube-api-access-mhkvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.304474 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.304513 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhkvf\" (UniqueName: \"kubernetes.io/projected/30662ccb-2b29-409a-8ccc-6e68c5d7435a-kube-api-access-mhkvf\") on node \"crc\" DevicePath \"\"" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.304522 4713 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.304531 4713 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.304578 4713 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.304591 4713 reconciler_common.go:293] "Volume 
detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30662ccb-2b29-409a-8ccc-6e68c5d7435a-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.304600 4713 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30662ccb-2b29-409a-8ccc-6e68c5d7435a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.902513 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764cb777d4-qh4kb_30662ccb-2b29-409a-8ccc-6e68c5d7435a/console/0.log" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.903776 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764cb777d4-qh4kb" event={"ID":"30662ccb-2b29-409a-8ccc-6e68c5d7435a","Type":"ContainerDied","Data":"8de2213c744755c6a63c002e363799ad0674ebf98a2fc2084b943c826b04e24c"} Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.903836 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-764cb777d4-qh4kb" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.903913 4713 scope.go:117] "RemoveContainer" containerID="b7c3030a8d99f0fa0ce902c3ef3ab3b397456026b6cd10a0e7cfc1344cade9ce" Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.941372 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-764cb777d4-qh4kb"] Mar 14 05:46:14 crc kubenswrapper[4713]: I0314 05:46:14.947395 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-764cb777d4-qh4kb"] Mar 14 05:46:15 crc kubenswrapper[4713]: I0314 05:46:15.574185 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30662ccb-2b29-409a-8ccc-6e68c5d7435a" path="/var/lib/kubelet/pods/30662ccb-2b29-409a-8ccc-6e68c5d7435a/volumes" Mar 14 05:46:15 crc kubenswrapper[4713]: I0314 05:46:15.974357 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"] Mar 14 05:46:15 crc kubenswrapper[4713]: E0314 05:46:15.974692 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30662ccb-2b29-409a-8ccc-6e68c5d7435a" containerName="console" Mar 14 05:46:15 crc kubenswrapper[4713]: I0314 05:46:15.974707 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="30662ccb-2b29-409a-8ccc-6e68c5d7435a" containerName="console" Mar 14 05:46:15 crc kubenswrapper[4713]: E0314 05:46:15.974733 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77944775-7eab-4ede-8a7d-31a489a29ae3" containerName="oc" Mar 14 05:46:15 crc kubenswrapper[4713]: I0314 05:46:15.974739 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="77944775-7eab-4ede-8a7d-31a489a29ae3" containerName="oc" Mar 14 05:46:15 crc kubenswrapper[4713]: I0314 05:46:15.974884 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="30662ccb-2b29-409a-8ccc-6e68c5d7435a" containerName="console" Mar 
Mar 14 05:46:15 crc kubenswrapper[4713]: I0314 05:46:15.974901 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="77944775-7eab-4ede-8a7d-31a489a29ae3" containerName="oc"
Mar 14 05:46:15 crc kubenswrapper[4713]: I0314 05:46:15.981028 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:15 crc kubenswrapper[4713]: I0314 05:46:15.984733 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.005748 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"]
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.129546 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.129924 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxrkv\" (UniqueName: \"kubernetes.io/projected/2c3f03b0-9008-47a3-8fbe-7d4366757e02-kube-api-access-nxrkv\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.129966 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.231378 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxrkv\" (UniqueName: \"kubernetes.io/projected/2c3f03b0-9008-47a3-8fbe-7d4366757e02-kube-api-access-nxrkv\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.231486 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.231611 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.232142 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.232281 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.254800 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxrkv\" (UniqueName: \"kubernetes.io/projected/2c3f03b0-9008-47a3-8fbe-7d4366757e02-kube-api-access-nxrkv\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.299886 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.734694 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"]
Mar 14 05:46:16 crc kubenswrapper[4713]: I0314 05:46:16.921834 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv" event={"ID":"2c3f03b0-9008-47a3-8fbe-7d4366757e02","Type":"ContainerStarted","Data":"fe8a72bff8aa04a77d2305dca0ceebac13e4b9a5b7deeeec2fcee5b1625aa38a"}
Mar 14 05:46:17 crc kubenswrapper[4713]: I0314 05:46:17.935796 4713 generic.go:334] "Generic (PLEG): container finished" podID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerID="e5d618c97803b81a0bd24cc0b1abb823406bd197f16c5d04cb3bf6d4c71f0380" exitCode=0
Mar 14 05:46:17 crc kubenswrapper[4713]: I0314 05:46:17.935851 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv" event={"ID":"2c3f03b0-9008-47a3-8fbe-7d4366757e02","Type":"ContainerDied","Data":"e5d618c97803b81a0bd24cc0b1abb823406bd197f16c5d04cb3bf6d4c71f0380"}
Mar 14 05:46:17 crc kubenswrapper[4713]: I0314 05:46:17.938510 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 05:46:21 crc kubenswrapper[4713]: I0314 05:46:21.964102 4713 generic.go:334] "Generic (PLEG): container finished" podID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerID="76197d3557625cfe7a44ff7f12b2c86b40e41da6e5c3598a93284d38fac0f743" exitCode=0
Mar 14 05:46:21 crc kubenswrapper[4713]: I0314 05:46:21.964167 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv" event={"ID":"2c3f03b0-9008-47a3-8fbe-7d4366757e02","Type":"ContainerDied","Data":"76197d3557625cfe7a44ff7f12b2c86b40e41da6e5c3598a93284d38fac0f743"}
Mar 14 05:46:23 crc kubenswrapper[4713]: I0314 05:46:23.982301 4713 generic.go:334] "Generic (PLEG): container finished" podID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerID="6a3da8112fed9d4194a290b7db0f8ee744493c40896fe09cfa226ed077004eb0" exitCode=0
Mar 14 05:46:23 crc kubenswrapper[4713]: I0314 05:46:23.982398 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv" event={"ID":"2c3f03b0-9008-47a3-8fbe-7d4366757e02","Type":"ContainerDied","Data":"6a3da8112fed9d4194a290b7db0f8ee744493c40896fe09cfa226ed077004eb0"}
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.285995 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.376059 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxrkv\" (UniqueName: \"kubernetes.io/projected/2c3f03b0-9008-47a3-8fbe-7d4366757e02-kube-api-access-nxrkv\") pod \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") "
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.376477 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-bundle\") pod \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") "
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.376519 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-util\") pod \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\" (UID: \"2c3f03b0-9008-47a3-8fbe-7d4366757e02\") "
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.377437 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-bundle" (OuterVolumeSpecName: "bundle") pod "2c3f03b0-9008-47a3-8fbe-7d4366757e02" (UID: "2c3f03b0-9008-47a3-8fbe-7d4366757e02"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.382989 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3f03b0-9008-47a3-8fbe-7d4366757e02-kube-api-access-nxrkv" (OuterVolumeSpecName: "kube-api-access-nxrkv") pod "2c3f03b0-9008-47a3-8fbe-7d4366757e02" (UID: "2c3f03b0-9008-47a3-8fbe-7d4366757e02"). InnerVolumeSpecName "kube-api-access-nxrkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.389005 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-util" (OuterVolumeSpecName: "util") pod "2c3f03b0-9008-47a3-8fbe-7d4366757e02" (UID: "2c3f03b0-9008-47a3-8fbe-7d4366757e02"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.477933 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.478403 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c3f03b0-9008-47a3-8fbe-7d4366757e02-util\") on node \"crc\" DevicePath \"\""
Mar 14 05:46:25 crc kubenswrapper[4713]: I0314 05:46:25.478509 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxrkv\" (UniqueName: \"kubernetes.io/projected/2c3f03b0-9008-47a3-8fbe-7d4366757e02-kube-api-access-nxrkv\") on node \"crc\" DevicePath \"\""
Mar 14 05:46:26 crc kubenswrapper[4713]: I0314 05:46:26.001126 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv" event={"ID":"2c3f03b0-9008-47a3-8fbe-7d4366757e02","Type":"ContainerDied","Data":"fe8a72bff8aa04a77d2305dca0ceebac13e4b9a5b7deeeec2fcee5b1625aa38a"}
Mar 14 05:46:26 crc kubenswrapper[4713]: I0314 05:46:26.001195 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe8a72bff8aa04a77d2305dca0ceebac13e4b9a5b7deeeec2fcee5b1625aa38a"
Mar 14 05:46:26 crc kubenswrapper[4713]: I0314 05:46:26.001238 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.928866 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"]
Mar 14 05:46:31 crc kubenswrapper[4713]: E0314 05:46:31.929764 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerName="pull"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.929782 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerName="pull"
Mar 14 05:46:31 crc kubenswrapper[4713]: E0314 05:46:31.929800 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerName="extract"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.929808 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerName="extract"
Mar 14 05:46:31 crc kubenswrapper[4713]: E0314 05:46:31.929841 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerName="util"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.929850 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerName="util"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.930012 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3f03b0-9008-47a3-8fbe-7d4366757e02" containerName="extract"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.930647 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.932805 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xltfn"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.932821 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.935291 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.935439 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.935602 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 14 05:46:31 crc kubenswrapper[4713]: I0314 05:46:31.949487 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"]
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.100471 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c7267d1-1d86-4ea3-91c6-5edc53bdfe01-apiservice-cert\") pod \"metallb-operator-controller-manager-84b689b795-q7lfp\" (UID: \"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01\") " pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.100522 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6wm7\" (UniqueName: \"kubernetes.io/projected/6c7267d1-1d86-4ea3-91c6-5edc53bdfe01-kube-api-access-f6wm7\") pod \"metallb-operator-controller-manager-84b689b795-q7lfp\" (UID: \"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01\") " pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.100608 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c7267d1-1d86-4ea3-91c6-5edc53bdfe01-webhook-cert\") pod \"metallb-operator-controller-manager-84b689b795-q7lfp\" (UID: \"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01\") " pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.201641 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c7267d1-1d86-4ea3-91c6-5edc53bdfe01-webhook-cert\") pod \"metallb-operator-controller-manager-84b689b795-q7lfp\" (UID: \"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01\") " pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.201788 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c7267d1-1d86-4ea3-91c6-5edc53bdfe01-apiservice-cert\") pod \"metallb-operator-controller-manager-84b689b795-q7lfp\" (UID: \"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01\") " pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.201813 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6wm7\" (UniqueName: \"kubernetes.io/projected/6c7267d1-1d86-4ea3-91c6-5edc53bdfe01-kube-api-access-f6wm7\") pod \"metallb-operator-controller-manager-84b689b795-q7lfp\" (UID: \"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01\") " pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.206882 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c7267d1-1d86-4ea3-91c6-5edc53bdfe01-apiservice-cert\") pod \"metallb-operator-controller-manager-84b689b795-q7lfp\" (UID: \"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01\") " pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.216627 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c7267d1-1d86-4ea3-91c6-5edc53bdfe01-webhook-cert\") pod \"metallb-operator-controller-manager-84b689b795-q7lfp\" (UID: \"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01\") " pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.225977 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6wm7\" (UniqueName: \"kubernetes.io/projected/6c7267d1-1d86-4ea3-91c6-5edc53bdfe01-kube-api-access-f6wm7\") pod \"metallb-operator-controller-manager-84b689b795-q7lfp\" (UID: \"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01\") " pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.254805 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"]
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.256685 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.260537 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.260539 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.261950 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bzd65"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.279722 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.291286 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"]
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.404608 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8440ca7f-e5cd-4deb-9e52-8be733b65583-webhook-cert\") pod \"metallb-operator-webhook-server-5b665cf668-zl2jw\" (UID: \"8440ca7f-e5cd-4deb-9e52-8be733b65583\") " pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.404670 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8440ca7f-e5cd-4deb-9e52-8be733b65583-apiservice-cert\") pod \"metallb-operator-webhook-server-5b665cf668-zl2jw\" (UID: \"8440ca7f-e5cd-4deb-9e52-8be733b65583\") " pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.404805 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g2h9\" (UniqueName: \"kubernetes.io/projected/8440ca7f-e5cd-4deb-9e52-8be733b65583-kube-api-access-4g2h9\") pod \"metallb-operator-webhook-server-5b665cf668-zl2jw\" (UID: \"8440ca7f-e5cd-4deb-9e52-8be733b65583\") " pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.506108 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8440ca7f-e5cd-4deb-9e52-8be733b65583-webhook-cert\") pod \"metallb-operator-webhook-server-5b665cf668-zl2jw\" (UID: \"8440ca7f-e5cd-4deb-9e52-8be733b65583\") " pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.506489 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8440ca7f-e5cd-4deb-9e52-8be733b65583-apiservice-cert\") pod \"metallb-operator-webhook-server-5b665cf668-zl2jw\" (UID: \"8440ca7f-e5cd-4deb-9e52-8be733b65583\") " pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.506554 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g2h9\" (UniqueName: \"kubernetes.io/projected/8440ca7f-e5cd-4deb-9e52-8be733b65583-kube-api-access-4g2h9\") pod \"metallb-operator-webhook-server-5b665cf668-zl2jw\" (UID: \"8440ca7f-e5cd-4deb-9e52-8be733b65583\") " pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.512969 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8440ca7f-e5cd-4deb-9e52-8be733b65583-apiservice-cert\") pod \"metallb-operator-webhook-server-5b665cf668-zl2jw\" (UID: \"8440ca7f-e5cd-4deb-9e52-8be733b65583\") " pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.515273 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8440ca7f-e5cd-4deb-9e52-8be733b65583-webhook-cert\") pod \"metallb-operator-webhook-server-5b665cf668-zl2jw\" (UID: \"8440ca7f-e5cd-4deb-9e52-8be733b65583\") " pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.527939 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g2h9\" (UniqueName: \"kubernetes.io/projected/8440ca7f-e5cd-4deb-9e52-8be733b65583-kube-api-access-4g2h9\") pod \"metallb-operator-webhook-server-5b665cf668-zl2jw\" (UID: \"8440ca7f-e5cd-4deb-9e52-8be733b65583\") " pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.595186 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:32 crc kubenswrapper[4713]: I0314 05:46:32.867253 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"]
Mar 14 05:46:33 crc kubenswrapper[4713]: I0314 05:46:33.056965 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"]
Mar 14 05:46:33 crc kubenswrapper[4713]: I0314 05:46:33.063383 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp" event={"ID":"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01","Type":"ContainerStarted","Data":"76efadd6f1b40b2c47368c95f56b8b49ff9a6054e3335b927ff0f196c19a0423"}
Mar 14 05:46:33 crc kubenswrapper[4713]: W0314 05:46:33.078600 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8440ca7f_e5cd_4deb_9e52_8be733b65583.slice/crio-0d662fb576fdecef80cf98526d2e9e74419b214a76d923cf58de116557f39cfb WatchSource:0}: Error finding container 0d662fb576fdecef80cf98526d2e9e74419b214a76d923cf58de116557f39cfb: Status 404 returned error can't find the container with id 0d662fb576fdecef80cf98526d2e9e74419b214a76d923cf58de116557f39cfb
Mar 14 05:46:34 crc kubenswrapper[4713]: I0314 05:46:34.070693 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" event={"ID":"8440ca7f-e5cd-4deb-9e52-8be733b65583","Type":"ContainerStarted","Data":"0d662fb576fdecef80cf98526d2e9e74419b214a76d923cf58de116557f39cfb"}
Mar 14 05:46:39 crc kubenswrapper[4713]: I0314 05:46:39.114305 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" event={"ID":"8440ca7f-e5cd-4deb-9e52-8be733b65583","Type":"ContainerStarted","Data":"24315edf7f9c7831495988b0093625f781a9e53320b097ca2701ae04d824b0de"}
Mar 14 05:46:39 crc kubenswrapper[4713]: I0314 05:46:39.114842 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:46:39 crc kubenswrapper[4713]: I0314 05:46:39.116354 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp" event={"ID":"6c7267d1-1d86-4ea3-91c6-5edc53bdfe01","Type":"ContainerStarted","Data":"a945a1b755ec796da8f6f0795e0deca9fa2be4b214bc70a00e4577a428f831e9"}
Mar 14 05:46:39 crc kubenswrapper[4713]: I0314 05:46:39.116486 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:46:39 crc kubenswrapper[4713]: I0314 05:46:39.139953 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" podStartSLOduration=2.027418012 podStartE2EDuration="7.139934325s" podCreationTimestamp="2026-03-14 05:46:32 +0000 UTC" firstStartedPulling="2026-03-14 05:46:33.082261579 +0000 UTC m=+1176.170170879" lastFinishedPulling="2026-03-14 05:46:38.194777872 +0000 UTC m=+1181.282687192" observedRunningTime="2026-03-14 05:46:39.138125657 +0000 UTC m=+1182.226034957" watchObservedRunningTime="2026-03-14 05:46:39.139934325 +0000 UTC m=+1182.227843625"
Mar 14 05:46:39 crc kubenswrapper[4713]: I0314 05:46:39.173726 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp" podStartSLOduration=2.870676207 podStartE2EDuration="8.173700808s" podCreationTimestamp="2026-03-14 05:46:31 +0000 UTC" firstStartedPulling="2026-03-14 05:46:32.870508121 +0000 UTC m=+1175.958417421" lastFinishedPulling="2026-03-14 05:46:38.173532722 +0000 UTC m=+1181.261442022" observedRunningTime="2026-03-14 05:46:39.168173941 +0000 UTC m=+1182.256083241" watchObservedRunningTime="2026-03-14 05:46:39.173700808 +0000 UTC m=+1182.261610108"
Mar 14 05:46:52 crc kubenswrapper[4713]: I0314 05:46:52.602113 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.283340 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp"
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.959822 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"]
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.961334 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.963118 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-cnz9d"
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.963905 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.966141 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-64h8p"]
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.969948 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.972473 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.981701 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 14 05:47:12 crc kubenswrapper[4713]: I0314 05:47:12.987106 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"]
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.081369 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cl7ll"]
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.083057 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cl7ll"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.086319 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.088789 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-knjfw"]
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.090623 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-knjfw"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091136 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vqhqc"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091237 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091509 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091829 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-metrics\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091870 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-frr-conf\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091903 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-metrics-certs\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091925 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-memberlist\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091945 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb9f27cd-ac40-407e-b9a5-f9594122604f-metrics-certs\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091975 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4353213b-b89f-4288-babb-7afef0ca216a-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-84hc6\" (UID: \"4353213b-b89f-4288-babb-7afef0ca216a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.091994 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c489d\" (UniqueName: \"kubernetes.io/projected/4353213b-b89f-4288-babb-7afef0ca216a-kube-api-access-c489d\") pod \"frr-k8s-webhook-server-bcc4b6f68-84hc6\" (UID: \"4353213b-b89f-4288-babb-7afef0ca216a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.092012 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl9c2\" (UniqueName: \"kubernetes.io/projected/fb9f27cd-ac40-407e-b9a5-f9594122604f-kube-api-access-dl9c2\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.092047 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-reloader\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.092081 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5r4m\" (UniqueName: \"kubernetes.io/projected/a5cbbe27-0738-4819-a4bc-5bc7d2945248-kube-api-access-s5r4m\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.092267 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-frr-sockets\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.092297 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a5cbbe27-0738-4819-a4bc-5bc7d2945248-metallb-excludel2\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.092408 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fb9f27cd-ac40-407e-b9a5-f9594122604f-frr-startup\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.095616 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-knjfw"]
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.095996 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194232 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-frr-conf\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194286 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-metrics-certs\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194316 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-memberlist\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194337 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb9f27cd-ac40-407e-b9a5-f9594122604f-metrics-certs\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194368 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4353213b-b89f-4288-babb-7afef0ca216a-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-84hc6\" (UID: \"4353213b-b89f-4288-babb-7afef0ca216a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194387 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c489d\" (UniqueName: \"kubernetes.io/projected/4353213b-b89f-4288-babb-7afef0ca216a-kube-api-access-c489d\") pod \"frr-k8s-webhook-server-bcc4b6f68-84hc6\" (UID: \"4353213b-b89f-4288-babb-7afef0ca216a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194407 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq8zh\" (UniqueName: \"kubernetes.io/projected/9640b5fe-f2ba-4a12-b456-1643ddc063f2-kube-api-access-gq8zh\") pod \"controller-7bb4cc7c98-knjfw\" (UID: \"9640b5fe-f2ba-4a12-b456-1643ddc063f2\") " pod="metallb-system/controller-7bb4cc7c98-knjfw"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194424 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl9c2\" (UniqueName: \"kubernetes.io/projected/fb9f27cd-ac40-407e-b9a5-f9594122604f-kube-api-access-dl9c2\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194457 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-reloader\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194478 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9640b5fe-f2ba-4a12-b456-1643ddc063f2-metrics-certs\") pod \"controller-7bb4cc7c98-knjfw\" (UID: \"9640b5fe-f2ba-4a12-b456-1643ddc063f2\") " pod="metallb-system/controller-7bb4cc7c98-knjfw"
Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194508 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5r4m\" (UniqueName:
\"kubernetes.io/projected/a5cbbe27-0738-4819-a4bc-5bc7d2945248-kube-api-access-s5r4m\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194527 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-frr-sockets\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194542 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a5cbbe27-0738-4819-a4bc-5bc7d2945248-metallb-excludel2\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194561 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fb9f27cd-ac40-407e-b9a5-f9594122604f-frr-startup\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194592 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9640b5fe-f2ba-4a12-b456-1643ddc063f2-cert\") pod \"controller-7bb4cc7c98-knjfw\" (UID: \"9640b5fe-f2ba-4a12-b456-1643ddc063f2\") " pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.194614 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-metrics\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " 
pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.195015 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-metrics\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.195217 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-reloader\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.195634 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-frr-conf\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.195837 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fb9f27cd-ac40-407e-b9a5-f9594122604f-frr-sockets\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: E0314 05:47:13.196357 4713 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 14 05:47:13 crc kubenswrapper[4713]: E0314 05:47:13.196441 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-memberlist podName:a5cbbe27-0738-4819-a4bc-5bc7d2945248 nodeName:}" failed. No retries permitted until 2026-03-14 05:47:13.696422388 +0000 UTC m=+1216.784331688 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-memberlist") pod "speaker-cl7ll" (UID: "a5cbbe27-0738-4819-a4bc-5bc7d2945248") : secret "metallb-memberlist" not found Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.196480 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a5cbbe27-0738-4819-a4bc-5bc7d2945248-metallb-excludel2\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.197063 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fb9f27cd-ac40-407e-b9a5-f9594122604f-frr-startup\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.205078 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb9f27cd-ac40-407e-b9a5-f9594122604f-metrics-certs\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.215710 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-metrics-certs\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.228232 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4353213b-b89f-4288-babb-7afef0ca216a-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-84hc6\" (UID: \"4353213b-b89f-4288-babb-7afef0ca216a\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.230011 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl9c2\" (UniqueName: \"kubernetes.io/projected/fb9f27cd-ac40-407e-b9a5-f9594122604f-kube-api-access-dl9c2\") pod \"frr-k8s-64h8p\" (UID: \"fb9f27cd-ac40-407e-b9a5-f9594122604f\") " pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.236029 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c489d\" (UniqueName: \"kubernetes.io/projected/4353213b-b89f-4288-babb-7afef0ca216a-kube-api-access-c489d\") pod \"frr-k8s-webhook-server-bcc4b6f68-84hc6\" (UID: \"4353213b-b89f-4288-babb-7afef0ca216a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.246869 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5r4m\" (UniqueName: \"kubernetes.io/projected/a5cbbe27-0738-4819-a4bc-5bc7d2945248-kube-api-access-s5r4m\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.296095 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq8zh\" (UniqueName: \"kubernetes.io/projected/9640b5fe-f2ba-4a12-b456-1643ddc063f2-kube-api-access-gq8zh\") pod \"controller-7bb4cc7c98-knjfw\" (UID: \"9640b5fe-f2ba-4a12-b456-1643ddc063f2\") " pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.296188 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9640b5fe-f2ba-4a12-b456-1643ddc063f2-metrics-certs\") pod \"controller-7bb4cc7c98-knjfw\" (UID: \"9640b5fe-f2ba-4a12-b456-1643ddc063f2\") " 
pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.296258 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9640b5fe-f2ba-4a12-b456-1643ddc063f2-cert\") pod \"controller-7bb4cc7c98-knjfw\" (UID: \"9640b5fe-f2ba-4a12-b456-1643ddc063f2\") " pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.296980 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.302696 4713 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.303511 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9640b5fe-f2ba-4a12-b456-1643ddc063f2-metrics-certs\") pod \"controller-7bb4cc7c98-knjfw\" (UID: \"9640b5fe-f2ba-4a12-b456-1643ddc063f2\") " pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.312074 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9640b5fe-f2ba-4a12-b456-1643ddc063f2-cert\") pod \"controller-7bb4cc7c98-knjfw\" (UID: \"9640b5fe-f2ba-4a12-b456-1643ddc063f2\") " pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.318823 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq8zh\" (UniqueName: \"kubernetes.io/projected/9640b5fe-f2ba-4a12-b456-1643ddc063f2-kube-api-access-gq8zh\") pod \"controller-7bb4cc7c98-knjfw\" (UID: \"9640b5fe-f2ba-4a12-b456-1643ddc063f2\") " pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.319193 4713 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.420401 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.701604 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-memberlist\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll" Mar 14 05:47:13 crc kubenswrapper[4713]: E0314 05:47:13.701753 4713 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 14 05:47:13 crc kubenswrapper[4713]: E0314 05:47:13.702080 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-memberlist podName:a5cbbe27-0738-4819-a4bc-5bc7d2945248 nodeName:}" failed. No retries permitted until 2026-03-14 05:47:14.702062144 +0000 UTC m=+1217.789971444 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-memberlist") pod "speaker-cl7ll" (UID: "a5cbbe27-0738-4819-a4bc-5bc7d2945248") : secret "metallb-memberlist" not found Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.825707 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"] Mar 14 05:47:13 crc kubenswrapper[4713]: W0314 05:47:13.827563 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4353213b_b89f_4288_babb_7afef0ca216a.slice/crio-5bb01310bdab919f223bbc36039f400e69fade2664e2cf7942ea34da8daa55ce WatchSource:0}: Error finding container 5bb01310bdab919f223bbc36039f400e69fade2664e2cf7942ea34da8daa55ce: Status 404 returned error can't find the container with id 5bb01310bdab919f223bbc36039f400e69fade2664e2cf7942ea34da8daa55ce Mar 14 05:47:13 crc kubenswrapper[4713]: I0314 05:47:13.920176 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-knjfw"] Mar 14 05:47:13 crc kubenswrapper[4713]: W0314 05:47:13.922889 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9640b5fe_f2ba_4a12_b456_1643ddc063f2.slice/crio-35351270caf4b2b63f0e040f08c839d03598da856b239472dad61a8a26b82996 WatchSource:0}: Error finding container 35351270caf4b2b63f0e040f08c839d03598da856b239472dad61a8a26b82996: Status 404 returned error can't find the container with id 35351270caf4b2b63f0e040f08c839d03598da856b239472dad61a8a26b82996 Mar 14 05:47:14 crc kubenswrapper[4713]: I0314 05:47:14.407727 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerStarted","Data":"5e7a423cdacaf7dc975aafd13ee18a417a433789ffba063105055c198e1768d9"} Mar 14 05:47:14 crc 
kubenswrapper[4713]: I0314 05:47:14.410120 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" event={"ID":"4353213b-b89f-4288-babb-7afef0ca216a","Type":"ContainerStarted","Data":"5bb01310bdab919f223bbc36039f400e69fade2664e2cf7942ea34da8daa55ce"} Mar 14 05:47:14 crc kubenswrapper[4713]: I0314 05:47:14.412231 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-knjfw" event={"ID":"9640b5fe-f2ba-4a12-b456-1643ddc063f2","Type":"ContainerStarted","Data":"518aa441c71340e82cb8b2312962ffdd9ffcbab7f115f3eaca411474f0d6baeb"} Mar 14 05:47:14 crc kubenswrapper[4713]: I0314 05:47:14.412374 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-knjfw" event={"ID":"9640b5fe-f2ba-4a12-b456-1643ddc063f2","Type":"ContainerStarted","Data":"22ffcce850850e8695a1a6ea896c0642722a70356795c61a23e23cb39f33b6aa"} Mar 14 05:47:14 crc kubenswrapper[4713]: I0314 05:47:14.412390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-knjfw" event={"ID":"9640b5fe-f2ba-4a12-b456-1643ddc063f2","Type":"ContainerStarted","Data":"35351270caf4b2b63f0e040f08c839d03598da856b239472dad61a8a26b82996"} Mar 14 05:47:14 crc kubenswrapper[4713]: I0314 05:47:14.412476 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:14 crc kubenswrapper[4713]: I0314 05:47:14.440048 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-knjfw" podStartSLOduration=1.440009345 podStartE2EDuration="1.440009345s" podCreationTimestamp="2026-03-14 05:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:47:14.430582303 +0000 UTC m=+1217.518491603" watchObservedRunningTime="2026-03-14 05:47:14.440009345 +0000 
UTC m=+1217.527918645" Mar 14 05:47:14 crc kubenswrapper[4713]: I0314 05:47:14.721668 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-memberlist\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll" Mar 14 05:47:14 crc kubenswrapper[4713]: I0314 05:47:14.733347 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5cbbe27-0738-4819-a4bc-5bc7d2945248-memberlist\") pod \"speaker-cl7ll\" (UID: \"a5cbbe27-0738-4819-a4bc-5bc7d2945248\") " pod="metallb-system/speaker-cl7ll" Mar 14 05:47:14 crc kubenswrapper[4713]: I0314 05:47:14.907343 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cl7ll" Mar 14 05:47:15 crc kubenswrapper[4713]: I0314 05:47:15.440034 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cl7ll" event={"ID":"a5cbbe27-0738-4819-a4bc-5bc7d2945248","Type":"ContainerStarted","Data":"694a853f948b9bae25fe94d1624a74ab29513f665e8594f47e6122ab5e9e9525"} Mar 14 05:47:15 crc kubenswrapper[4713]: I0314 05:47:15.440375 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cl7ll" event={"ID":"a5cbbe27-0738-4819-a4bc-5bc7d2945248","Type":"ContainerStarted","Data":"b12a9971b1aa999ff8425233ecb4f6da4cd93a07c96480f651a7352f19d25a3a"} Mar 14 05:47:16 crc kubenswrapper[4713]: I0314 05:47:16.475967 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cl7ll" event={"ID":"a5cbbe27-0738-4819-a4bc-5bc7d2945248","Type":"ContainerStarted","Data":"aa58e74e378e38886151c200a72968f35b3c347d5690a88fb598489ea18db772"} Mar 14 05:47:16 crc kubenswrapper[4713]: I0314 05:47:16.477168 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cl7ll" Mar 14 05:47:16 crc 
kubenswrapper[4713]: I0314 05:47:16.502833 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cl7ll" podStartSLOduration=3.502804278 podStartE2EDuration="3.502804278s" podCreationTimestamp="2026-03-14 05:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:47:16.502601151 +0000 UTC m=+1219.590510451" watchObservedRunningTime="2026-03-14 05:47:16.502804278 +0000 UTC m=+1219.590713578" Mar 14 05:47:22 crc kubenswrapper[4713]: I0314 05:47:22.532337 4713 generic.go:334] "Generic (PLEG): container finished" podID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerID="0485426a1bb1d9f537d11351321258f73184df259c3ab8d9ff53d03d6f7729fc" exitCode=0 Mar 14 05:47:22 crc kubenswrapper[4713]: I0314 05:47:22.532774 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerDied","Data":"0485426a1bb1d9f537d11351321258f73184df259c3ab8d9ff53d03d6f7729fc"} Mar 14 05:47:22 crc kubenswrapper[4713]: I0314 05:47:22.536498 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" event={"ID":"4353213b-b89f-4288-babb-7afef0ca216a","Type":"ContainerStarted","Data":"aaf5cad0707989fc94a29b45cd275755b022b2382c7d94e7f417872e3f270f54"} Mar 14 05:47:22 crc kubenswrapper[4713]: I0314 05:47:22.536707 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" Mar 14 05:47:23 crc kubenswrapper[4713]: I0314 05:47:23.552138 4713 generic.go:334] "Generic (PLEG): container finished" podID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerID="b0625d9423a73e62ba6e9ad04e0688a0514b48ce889b09d82cf631c2447cc369" exitCode=0 Mar 14 05:47:23 crc kubenswrapper[4713]: I0314 05:47:23.552340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerDied","Data":"b0625d9423a73e62ba6e9ad04e0688a0514b48ce889b09d82cf631c2447cc369"} Mar 14 05:47:23 crc kubenswrapper[4713]: I0314 05:47:23.611682 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" podStartSLOduration=3.983197715 podStartE2EDuration="11.611662248s" podCreationTimestamp="2026-03-14 05:47:12 +0000 UTC" firstStartedPulling="2026-03-14 05:47:13.829424676 +0000 UTC m=+1216.917333976" lastFinishedPulling="2026-03-14 05:47:21.457889209 +0000 UTC m=+1224.545798509" observedRunningTime="2026-03-14 05:47:22.590052075 +0000 UTC m=+1225.677961385" watchObservedRunningTime="2026-03-14 05:47:23.611662248 +0000 UTC m=+1226.699571538" Mar 14 05:47:24 crc kubenswrapper[4713]: I0314 05:47:24.561859 4713 generic.go:334] "Generic (PLEG): container finished" podID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerID="85158df7ea651831b3fe957039754155897da8e0ba03b1f6a142f6e50ae9c04e" exitCode=0 Mar 14 05:47:24 crc kubenswrapper[4713]: I0314 05:47:24.561919 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerDied","Data":"85158df7ea651831b3fe957039754155897da8e0ba03b1f6a142f6e50ae9c04e"} Mar 14 05:47:25 crc kubenswrapper[4713]: I0314 05:47:25.578669 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerStarted","Data":"54fe0af700455a9105d3989f42205f1e1f67fb4b4b1116c15a93012198cd565d"} Mar 14 05:47:25 crc kubenswrapper[4713]: I0314 05:47:25.579088 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerStarted","Data":"cbdec43a5dcc8c83334a47ab60b26aa7c2dd1defbcb03cfa746cfe22aefd3b6b"} Mar 
14 05:47:25 crc kubenswrapper[4713]: I0314 05:47:25.579102 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerStarted","Data":"ae8f3f497a7e1f2784f8929d857ad944545fb9d4a5baaa285c02257b2f6619ad"} Mar 14 05:47:25 crc kubenswrapper[4713]: I0314 05:47:25.579110 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerStarted","Data":"fa9c779b58a19a8105e4e00fbd558ac3b5fad3f2ebff0bcc9dd6b29f92f3d5d5"} Mar 14 05:47:26 crc kubenswrapper[4713]: I0314 05:47:26.590728 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerStarted","Data":"cfe9243e5711604a84e9fa3be4eb545a80719a39114a6ea5ce258c7743dff688"} Mar 14 05:47:26 crc kubenswrapper[4713]: I0314 05:47:26.591049 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:26 crc kubenswrapper[4713]: I0314 05:47:26.591066 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerStarted","Data":"8476cf3bb0a8b5f922139772f10473ed97d486c98de55d71b97800ee8666220f"} Mar 14 05:47:26 crc kubenswrapper[4713]: I0314 05:47:26.611151 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-64h8p" podStartSLOduration=6.70096834 podStartE2EDuration="14.611132001s" podCreationTimestamp="2026-03-14 05:47:12 +0000 UTC" firstStartedPulling="2026-03-14 05:47:13.543379798 +0000 UTC m=+1216.631289098" lastFinishedPulling="2026-03-14 05:47:21.453543459 +0000 UTC m=+1224.541452759" observedRunningTime="2026-03-14 05:47:26.609929783 +0000 UTC m=+1229.697839103" watchObservedRunningTime="2026-03-14 05:47:26.611132001 +0000 UTC m=+1229.699041301" 
Mar 14 05:47:28 crc kubenswrapper[4713]: I0314 05:47:28.320224 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:28 crc kubenswrapper[4713]: I0314 05:47:28.368279 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-64h8p" Mar 14 05:47:33 crc kubenswrapper[4713]: I0314 05:47:33.303943 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" Mar 14 05:47:33 crc kubenswrapper[4713]: I0314 05:47:33.424502 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-knjfw" Mar 14 05:47:34 crc kubenswrapper[4713]: I0314 05:47:34.911079 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cl7ll" Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.049079 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-j5twq"] Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.050801 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-j5twq" Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.054942 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.055291 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-h7947" Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.055450 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.072952 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j5twq"] Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.213913 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g67xv\" (UniqueName: \"kubernetes.io/projected/cb76543c-a17a-4c09-a065-bc284c133f8a-kube-api-access-g67xv\") pod \"openstack-operator-index-j5twq\" (UID: \"cb76543c-a17a-4c09-a065-bc284c133f8a\") " pod="openstack-operators/openstack-operator-index-j5twq" Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.316123 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g67xv\" (UniqueName: \"kubernetes.io/projected/cb76543c-a17a-4c09-a065-bc284c133f8a-kube-api-access-g67xv\") pod \"openstack-operator-index-j5twq\" (UID: \"cb76543c-a17a-4c09-a065-bc284c133f8a\") " pod="openstack-operators/openstack-operator-index-j5twq" Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.334795 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g67xv\" (UniqueName: \"kubernetes.io/projected/cb76543c-a17a-4c09-a065-bc284c133f8a-kube-api-access-g67xv\") pod \"openstack-operator-index-j5twq\" (UID: 
\"cb76543c-a17a-4c09-a065-bc284c133f8a\") " pod="openstack-operators/openstack-operator-index-j5twq"
Mar 14 05:47:38 crc kubenswrapper[4713]: I0314 05:47:38.374105 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j5twq"
Mar 14 05:47:39 crc kubenswrapper[4713]: I0314 05:47:39.203136 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j5twq"]
Mar 14 05:47:39 crc kubenswrapper[4713]: I0314 05:47:39.685128 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j5twq" event={"ID":"cb76543c-a17a-4c09-a065-bc284c133f8a","Type":"ContainerStarted","Data":"76ee4465f6e5af3533e73c24c5b2f1d40e2fb755bdaa2878ecf4c893157a385a"}
Mar 14 05:47:40 crc kubenswrapper[4713]: I0314 05:47:40.731090 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 05:47:40 crc kubenswrapper[4713]: I0314 05:47:40.731503 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 05:47:41 crc kubenswrapper[4713]: I0314 05:47:41.426089 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-j5twq"]
Mar 14 05:47:42 crc kubenswrapper[4713]: I0314 05:47:42.151848 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-44kkb"]
Mar 14 05:47:42 crc kubenswrapper[4713]: I0314 05:47:42.153124 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 05:47:42 crc kubenswrapper[4713]: I0314 05:47:42.167993 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-44kkb"]
Mar 14 05:47:42 crc kubenswrapper[4713]: I0314 05:47:42.216637 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqppq\" (UniqueName: \"kubernetes.io/projected/5a437700-77f6-4838-9a7d-89eda8a27afa-kube-api-access-kqppq\") pod \"openstack-operator-index-44kkb\" (UID: \"5a437700-77f6-4838-9a7d-89eda8a27afa\") " pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 05:47:42 crc kubenswrapper[4713]: I0314 05:47:42.318175 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqppq\" (UniqueName: \"kubernetes.io/projected/5a437700-77f6-4838-9a7d-89eda8a27afa-kube-api-access-kqppq\") pod \"openstack-operator-index-44kkb\" (UID: \"5a437700-77f6-4838-9a7d-89eda8a27afa\") " pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 05:47:42 crc kubenswrapper[4713]: I0314 05:47:42.336742 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqppq\" (UniqueName: \"kubernetes.io/projected/5a437700-77f6-4838-9a7d-89eda8a27afa-kube-api-access-kqppq\") pod \"openstack-operator-index-44kkb\" (UID: \"5a437700-77f6-4838-9a7d-89eda8a27afa\") " pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 05:47:42 crc kubenswrapper[4713]: I0314 05:47:42.479640 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 05:47:43 crc kubenswrapper[4713]: I0314 05:47:43.323316 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-64h8p"
Mar 14 05:47:44 crc kubenswrapper[4713]: I0314 05:47:44.220799 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-44kkb"]
Mar 14 05:47:44 crc kubenswrapper[4713]: W0314 05:47:44.261755 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a437700_77f6_4838_9a7d_89eda8a27afa.slice/crio-00738ae2430832c6f549d2ad444022cc457ca355080cf30c4d8e9da2b371b4bf WatchSource:0}: Error finding container 00738ae2430832c6f549d2ad444022cc457ca355080cf30c4d8e9da2b371b4bf: Status 404 returned error can't find the container with id 00738ae2430832c6f549d2ad444022cc457ca355080cf30c4d8e9da2b371b4bf
Mar 14 05:47:45 crc kubenswrapper[4713]: I0314 05:47:45.128684 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-44kkb" event={"ID":"5a437700-77f6-4838-9a7d-89eda8a27afa","Type":"ContainerStarted","Data":"1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766"}
Mar 14 05:47:45 crc kubenswrapper[4713]: I0314 05:47:45.129289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-44kkb" event={"ID":"5a437700-77f6-4838-9a7d-89eda8a27afa","Type":"ContainerStarted","Data":"00738ae2430832c6f549d2ad444022cc457ca355080cf30c4d8e9da2b371b4bf"}
Mar 14 05:47:45 crc kubenswrapper[4713]: I0314 05:47:45.131015 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j5twq" event={"ID":"cb76543c-a17a-4c09-a065-bc284c133f8a","Type":"ContainerStarted","Data":"3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2"}
Mar 14 05:47:45 crc kubenswrapper[4713]: I0314 05:47:45.131112 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-j5twq" podUID="cb76543c-a17a-4c09-a065-bc284c133f8a" containerName="registry-server" containerID="cri-o://3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2" gracePeriod=2
Mar 14 05:47:45 crc kubenswrapper[4713]: I0314 05:47:45.148147 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-44kkb" podStartSLOduration=3.091015153 podStartE2EDuration="3.148128014s" podCreationTimestamp="2026-03-14 05:47:42 +0000 UTC" firstStartedPulling="2026-03-14 05:47:44.274154523 +0000 UTC m=+1247.362063813" lastFinishedPulling="2026-03-14 05:47:44.331267364 +0000 UTC m=+1247.419176674" observedRunningTime="2026-03-14 05:47:45.143921368 +0000 UTC m=+1248.231830658" watchObservedRunningTime="2026-03-14 05:47:45.148128014 +0000 UTC m=+1248.236037314"
Mar 14 05:47:45 crc kubenswrapper[4713]: I0314 05:47:45.169121 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-j5twq" podStartSLOduration=2.106981394 podStartE2EDuration="7.169106856s" podCreationTimestamp="2026-03-14 05:47:38 +0000 UTC" firstStartedPulling="2026-03-14 05:47:39.223001812 +0000 UTC m=+1242.310911112" lastFinishedPulling="2026-03-14 05:47:44.285127274 +0000 UTC m=+1247.373036574" observedRunningTime="2026-03-14 05:47:45.166817512 +0000 UTC m=+1248.254726812" watchObservedRunningTime="2026-03-14 05:47:45.169106856 +0000 UTC m=+1248.257016156"
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.072304 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j5twq"
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.140065 4713 generic.go:334] "Generic (PLEG): container finished" podID="cb76543c-a17a-4c09-a065-bc284c133f8a" containerID="3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2" exitCode=0
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.140118 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j5twq" event={"ID":"cb76543c-a17a-4c09-a065-bc284c133f8a","Type":"ContainerDied","Data":"3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2"}
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.140179 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j5twq" event={"ID":"cb76543c-a17a-4c09-a065-bc284c133f8a","Type":"ContainerDied","Data":"76ee4465f6e5af3533e73c24c5b2f1d40e2fb755bdaa2878ecf4c893157a385a"}
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.140146 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j5twq"
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.140198 4713 scope.go:117] "RemoveContainer" containerID="3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2"
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.159870 4713 scope.go:117] "RemoveContainer" containerID="3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2"
Mar 14 05:47:46 crc kubenswrapper[4713]: E0314 05:47:46.160717 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2\": container with ID starting with 3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2 not found: ID does not exist" containerID="3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2"
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.160763 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2"} err="failed to get container status \"3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2\": rpc error: code = NotFound desc = could not find container \"3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2\": container with ID starting with 3982c30e9437e407f81d09bf440eb3b18b099920653b59f8ef8cc64cd7ade8a2 not found: ID does not exist"
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.215881 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g67xv\" (UniqueName: \"kubernetes.io/projected/cb76543c-a17a-4c09-a065-bc284c133f8a-kube-api-access-g67xv\") pod \"cb76543c-a17a-4c09-a065-bc284c133f8a\" (UID: \"cb76543c-a17a-4c09-a065-bc284c133f8a\") "
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.222106 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb76543c-a17a-4c09-a065-bc284c133f8a-kube-api-access-g67xv" (OuterVolumeSpecName: "kube-api-access-g67xv") pod "cb76543c-a17a-4c09-a065-bc284c133f8a" (UID: "cb76543c-a17a-4c09-a065-bc284c133f8a"). InnerVolumeSpecName "kube-api-access-g67xv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.318280 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g67xv\" (UniqueName: \"kubernetes.io/projected/cb76543c-a17a-4c09-a065-bc284c133f8a-kube-api-access-g67xv\") on node \"crc\" DevicePath \"\""
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.470053 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-j5twq"]
Mar 14 05:47:46 crc kubenswrapper[4713]: I0314 05:47:46.477262 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-j5twq"]
Mar 14 05:47:47 crc kubenswrapper[4713]: I0314 05:47:47.573644 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb76543c-a17a-4c09-a065-bc284c133f8a" path="/var/lib/kubelet/pods/cb76543c-a17a-4c09-a065-bc284c133f8a/volumes"
Mar 14 05:47:52 crc kubenswrapper[4713]: I0314 05:47:52.480245 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 05:47:52 crc kubenswrapper[4713]: I0314 05:47:52.480991 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 05:47:52 crc kubenswrapper[4713]: I0314 05:47:52.515892 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 05:47:53 crc kubenswrapper[4713]: I0314 05:47:53.267616 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.140120 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557788-72k4j"]
Mar 14 05:48:00 crc kubenswrapper[4713]: E0314 05:48:00.140933 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb76543c-a17a-4c09-a065-bc284c133f8a" containerName="registry-server"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.140949 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb76543c-a17a-4c09-a065-bc284c133f8a" containerName="registry-server"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.141122 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb76543c-a17a-4c09-a065-bc284c133f8a" containerName="registry-server"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.141805 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557788-72k4j"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.147803 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557788-72k4j"]
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.148848 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.149039 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.149104 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.301271 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sznmm\" (UniqueName: \"kubernetes.io/projected/046a191f-1297-43e4-ad80-9cfdad08202b-kube-api-access-sznmm\") pod \"auto-csr-approver-29557788-72k4j\" (UID: \"046a191f-1297-43e4-ad80-9cfdad08202b\") " pod="openshift-infra/auto-csr-approver-29557788-72k4j"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.403667 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sznmm\" (UniqueName: \"kubernetes.io/projected/046a191f-1297-43e4-ad80-9cfdad08202b-kube-api-access-sznmm\") pod \"auto-csr-approver-29557788-72k4j\" (UID: \"046a191f-1297-43e4-ad80-9cfdad08202b\") " pod="openshift-infra/auto-csr-approver-29557788-72k4j"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.426272 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sznmm\" (UniqueName: \"kubernetes.io/projected/046a191f-1297-43e4-ad80-9cfdad08202b-kube-api-access-sznmm\") pod \"auto-csr-approver-29557788-72k4j\" (UID: \"046a191f-1297-43e4-ad80-9cfdad08202b\") " pod="openshift-infra/auto-csr-approver-29557788-72k4j"
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.457587 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557788-72k4j"
Mar 14 05:48:00 crc kubenswrapper[4713]: W0314 05:48:00.971221 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod046a191f_1297_43e4_ad80_9cfdad08202b.slice/crio-7aaaa6b150a323ac23a2dde6435d3e99fa1b1a3e73c605a3891fe3b0b25c8d63 WatchSource:0}: Error finding container 7aaaa6b150a323ac23a2dde6435d3e99fa1b1a3e73c605a3891fe3b0b25c8d63: Status 404 returned error can't find the container with id 7aaaa6b150a323ac23a2dde6435d3e99fa1b1a3e73c605a3891fe3b0b25c8d63
Mar 14 05:48:00 crc kubenswrapper[4713]: I0314 05:48:00.971473 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557788-72k4j"]
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.322222 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"]
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.324545 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557788-72k4j" event={"ID":"046a191f-1297-43e4-ad80-9cfdad08202b","Type":"ContainerStarted","Data":"7aaaa6b150a323ac23a2dde6435d3e99fa1b1a3e73c605a3891fe3b0b25c8d63"}
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.324674 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.329340 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bd8c6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.334567 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"]
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.422664 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-util\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.422959 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnsxk\" (UniqueName: \"kubernetes.io/projected/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-kube-api-access-mnsxk\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.423091 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-bundle\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.525078 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-util\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.525193 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnsxk\" (UniqueName: \"kubernetes.io/projected/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-kube-api-access-mnsxk\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.525270 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-bundle\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.525763 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-bundle\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.525782 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-util\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.544508 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnsxk\" (UniqueName: \"kubernetes.io/projected/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-kube-api-access-mnsxk\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:01 crc kubenswrapper[4713]: I0314 05:48:01.645868 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:02 crc kubenswrapper[4713]: I0314 05:48:02.123826 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"]
Mar 14 05:48:02 crc kubenswrapper[4713]: W0314 05:48:02.134220 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90fc82f6_b9df_45d9_bdc7_6eae42f17b64.slice/crio-a768b38eecd081c8d85a4fa78705c750ff76b34c6c2692f7289ec6a15104ff23 WatchSource:0}: Error finding container a768b38eecd081c8d85a4fa78705c750ff76b34c6c2692f7289ec6a15104ff23: Status 404 returned error can't find the container with id a768b38eecd081c8d85a4fa78705c750ff76b34c6c2692f7289ec6a15104ff23
Mar 14 05:48:02 crc kubenswrapper[4713]: I0314 05:48:02.332974 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6" event={"ID":"90fc82f6-b9df-45d9-bdc7-6eae42f17b64","Type":"ContainerStarted","Data":"892c0caa175a1e7182afbd59fc4144d497c50676075a4fc54228fb3bd8f84fa9"}
Mar 14 05:48:02 crc kubenswrapper[4713]: I0314 05:48:02.333289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6" event={"ID":"90fc82f6-b9df-45d9-bdc7-6eae42f17b64","Type":"ContainerStarted","Data":"a768b38eecd081c8d85a4fa78705c750ff76b34c6c2692f7289ec6a15104ff23"}
Mar 14 05:48:03 crc kubenswrapper[4713]: I0314 05:48:03.340981 4713 generic.go:334] "Generic (PLEG): container finished" podID="046a191f-1297-43e4-ad80-9cfdad08202b" containerID="fe4980687f600d8db3607ee143d285ff117a4fb69855303a300c79f037e46b27" exitCode=0
Mar 14 05:48:03 crc kubenswrapper[4713]: I0314 05:48:03.341063 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557788-72k4j" event={"ID":"046a191f-1297-43e4-ad80-9cfdad08202b","Type":"ContainerDied","Data":"fe4980687f600d8db3607ee143d285ff117a4fb69855303a300c79f037e46b27"}
Mar 14 05:48:03 crc kubenswrapper[4713]: I0314 05:48:03.343508 4713 generic.go:334] "Generic (PLEG): container finished" podID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerID="892c0caa175a1e7182afbd59fc4144d497c50676075a4fc54228fb3bd8f84fa9" exitCode=0
Mar 14 05:48:03 crc kubenswrapper[4713]: I0314 05:48:03.343597 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6" event={"ID":"90fc82f6-b9df-45d9-bdc7-6eae42f17b64","Type":"ContainerDied","Data":"892c0caa175a1e7182afbd59fc4144d497c50676075a4fc54228fb3bd8f84fa9"}
Mar 14 05:48:04 crc kubenswrapper[4713]: I0314 05:48:04.352346 4713 generic.go:334] "Generic (PLEG): container finished" podID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerID="76a1748685c6a70f2e248a3e405200df6859c061cdc882e5c94542dab2f0ec0c" exitCode=0
Mar 14 05:48:04 crc kubenswrapper[4713]: I0314 05:48:04.352466 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6" event={"ID":"90fc82f6-b9df-45d9-bdc7-6eae42f17b64","Type":"ContainerDied","Data":"76a1748685c6a70f2e248a3e405200df6859c061cdc882e5c94542dab2f0ec0c"}
Mar 14 05:48:04 crc kubenswrapper[4713]: I0314 05:48:04.728618 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557788-72k4j"
Mar 14 05:48:04 crc kubenswrapper[4713]: I0314 05:48:04.895187 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sznmm\" (UniqueName: \"kubernetes.io/projected/046a191f-1297-43e4-ad80-9cfdad08202b-kube-api-access-sznmm\") pod \"046a191f-1297-43e4-ad80-9cfdad08202b\" (UID: \"046a191f-1297-43e4-ad80-9cfdad08202b\") "
Mar 14 05:48:04 crc kubenswrapper[4713]: I0314 05:48:04.901568 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046a191f-1297-43e4-ad80-9cfdad08202b-kube-api-access-sznmm" (OuterVolumeSpecName: "kube-api-access-sznmm") pod "046a191f-1297-43e4-ad80-9cfdad08202b" (UID: "046a191f-1297-43e4-ad80-9cfdad08202b"). InnerVolumeSpecName "kube-api-access-sznmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:48:04 crc kubenswrapper[4713]: I0314 05:48:04.997602 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sznmm\" (UniqueName: \"kubernetes.io/projected/046a191f-1297-43e4-ad80-9cfdad08202b-kube-api-access-sznmm\") on node \"crc\" DevicePath \"\""
Mar 14 05:48:05 crc kubenswrapper[4713]: I0314 05:48:05.362959 4713 generic.go:334] "Generic (PLEG): container finished" podID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerID="4148fb00f07200686527613ac1c76a65f395fb8b85487e33c0f1682137ad625e" exitCode=0
Mar 14 05:48:05 crc kubenswrapper[4713]: I0314 05:48:05.363074 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6" event={"ID":"90fc82f6-b9df-45d9-bdc7-6eae42f17b64","Type":"ContainerDied","Data":"4148fb00f07200686527613ac1c76a65f395fb8b85487e33c0f1682137ad625e"}
Mar 14 05:48:05 crc kubenswrapper[4713]: I0314 05:48:05.364582 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557788-72k4j" event={"ID":"046a191f-1297-43e4-ad80-9cfdad08202b","Type":"ContainerDied","Data":"7aaaa6b150a323ac23a2dde6435d3e99fa1b1a3e73c605a3891fe3b0b25c8d63"}
Mar 14 05:48:05 crc kubenswrapper[4713]: I0314 05:48:05.364609 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aaaa6b150a323ac23a2dde6435d3e99fa1b1a3e73c605a3891fe3b0b25c8d63"
Mar 14 05:48:05 crc kubenswrapper[4713]: I0314 05:48:05.364685 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557788-72k4j"
Mar 14 05:48:05 crc kubenswrapper[4713]: I0314 05:48:05.787332 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557782-dpssc"]
Mar 14 05:48:05 crc kubenswrapper[4713]: I0314 05:48:05.793193 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557782-dpssc"]
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.812363 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.893847 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnsxk\" (UniqueName: \"kubernetes.io/projected/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-kube-api-access-mnsxk\") pod \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") "
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.893990 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-bundle\") pod \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") "
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.894105 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-util\") pod \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\" (UID: \"90fc82f6-b9df-45d9-bdc7-6eae42f17b64\") "
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.895020 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-bundle" (OuterVolumeSpecName: "bundle") pod "90fc82f6-b9df-45d9-bdc7-6eae42f17b64" (UID: "90fc82f6-b9df-45d9-bdc7-6eae42f17b64"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.899225 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-kube-api-access-mnsxk" (OuterVolumeSpecName: "kube-api-access-mnsxk") pod "90fc82f6-b9df-45d9-bdc7-6eae42f17b64" (UID: "90fc82f6-b9df-45d9-bdc7-6eae42f17b64"). InnerVolumeSpecName "kube-api-access-mnsxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.913902 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-util" (OuterVolumeSpecName: "util") pod "90fc82f6-b9df-45d9-bdc7-6eae42f17b64" (UID: "90fc82f6-b9df-45d9-bdc7-6eae42f17b64"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.995846 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-util\") on node \"crc\" DevicePath \"\""
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.995895 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnsxk\" (UniqueName: \"kubernetes.io/projected/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-kube-api-access-mnsxk\") on node \"crc\" DevicePath \"\""
Mar 14 05:48:06 crc kubenswrapper[4713]: I0314 05:48:06.995912 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90fc82f6-b9df-45d9-bdc7-6eae42f17b64-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:48:07 crc kubenswrapper[4713]: I0314 05:48:07.503668 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6" event={"ID":"90fc82f6-b9df-45d9-bdc7-6eae42f17b64","Type":"ContainerDied","Data":"a768b38eecd081c8d85a4fa78705c750ff76b34c6c2692f7289ec6a15104ff23"}
Mar 14 05:48:07 crc kubenswrapper[4713]: I0314 05:48:07.503714 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a768b38eecd081c8d85a4fa78705c750ff76b34c6c2692f7289ec6a15104ff23"
Mar 14 05:48:07 crc kubenswrapper[4713]: I0314 05:48:07.503726 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6"
Mar 14 05:48:07 crc kubenswrapper[4713]: I0314 05:48:07.577702 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a52a89-d171-46d2-9a6d-6263fb859454" path="/var/lib/kubelet/pods/32a52a89-d171-46d2-9a6d-6263fb859454/volumes"
Mar 14 05:48:10 crc kubenswrapper[4713]: I0314 05:48:10.732071 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 05:48:10 crc kubenswrapper[4713]: I0314 05:48:10.732685 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 05:48:11 crc kubenswrapper[4713]: I0314 05:48:11.011685 4713 scope.go:117] "RemoveContainer" containerID="7df476fc18b38503e61dd067ffe79ff0d7f2f7df934b3a84918ed470e578f06e"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.573690 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2"]
Mar 14 05:48:13 crc kubenswrapper[4713]: E0314 05:48:13.574188 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046a191f-1297-43e4-ad80-9cfdad08202b" containerName="oc"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.574199 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="046a191f-1297-43e4-ad80-9cfdad08202b" containerName="oc"
Mar 14 05:48:13 crc kubenswrapper[4713]: E0314 05:48:13.574241 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerName="extract"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.574246 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerName="extract"
Mar 14 05:48:13 crc kubenswrapper[4713]: E0314 05:48:13.574265 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerName="util"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.574270 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerName="util"
Mar 14 05:48:13 crc kubenswrapper[4713]: E0314 05:48:13.574280 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerName="pull"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.574285 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerName="pull"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.574446 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="90fc82f6-b9df-45d9-bdc7-6eae42f17b64" containerName="extract"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.574462 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="046a191f-1297-43e4-ad80-9cfdad08202b" containerName="oc"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.575004 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.578707 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-4kvft"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.647132 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2"]
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.695821 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb2ck\" (UniqueName: \"kubernetes.io/projected/129ebe3f-95aa-42f1-8f56-1d3120fb5419-kube-api-access-rb2ck\") pod \"openstack-operator-controller-init-64f68cccc7-5r6v2\" (UID: \"129ebe3f-95aa-42f1-8f56-1d3120fb5419\") " pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.797995 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb2ck\" (UniqueName: \"kubernetes.io/projected/129ebe3f-95aa-42f1-8f56-1d3120fb5419-kube-api-access-rb2ck\") pod \"openstack-operator-controller-init-64f68cccc7-5r6v2\" (UID: \"129ebe3f-95aa-42f1-8f56-1d3120fb5419\") " pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.824168 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb2ck\" (UniqueName: \"kubernetes.io/projected/129ebe3f-95aa-42f1-8f56-1d3120fb5419-kube-api-access-rb2ck\") pod \"openstack-operator-controller-init-64f68cccc7-5r6v2\" (UID: \"129ebe3f-95aa-42f1-8f56-1d3120fb5419\") " pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2"
Mar 14 05:48:13 crc kubenswrapper[4713]: I0314 05:48:13.895020 4713 util.go:30] "No sandbox for pod can be
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" Mar 14 05:48:14 crc kubenswrapper[4713]: I0314 05:48:14.348068 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2"] Mar 14 05:48:14 crc kubenswrapper[4713]: I0314 05:48:14.554042 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" event={"ID":"129ebe3f-95aa-42f1-8f56-1d3120fb5419","Type":"ContainerStarted","Data":"0e0bd6fc66ad744d962ef934626428208020a043eaaf382f69abbf74807d530e"} Mar 14 05:48:21 crc kubenswrapper[4713]: I0314 05:48:21.626122 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" event={"ID":"129ebe3f-95aa-42f1-8f56-1d3120fb5419","Type":"ContainerStarted","Data":"d2c0f69abd01632b513051ef4714ecef107efeaaba50f224b2dbe38709b8db9a"} Mar 14 05:48:21 crc kubenswrapper[4713]: I0314 05:48:21.626733 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" Mar 14 05:48:33 crc kubenswrapper[4713]: I0314 05:48:33.898133 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" Mar 14 05:48:33 crc kubenswrapper[4713]: I0314 05:48:33.931878 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" podStartSLOduration=14.412154305 podStartE2EDuration="20.931854132s" podCreationTimestamp="2026-03-14 05:48:13 +0000 UTC" firstStartedPulling="2026-03-14 05:48:14.347598848 +0000 UTC m=+1277.435508158" lastFinishedPulling="2026-03-14 05:48:20.867298685 +0000 UTC m=+1283.955207985" observedRunningTime="2026-03-14 05:48:21.657819182 +0000 UTC 
m=+1284.745728482" watchObservedRunningTime="2026-03-14 05:48:33.931854132 +0000 UTC m=+1297.019763432" Mar 14 05:48:40 crc kubenswrapper[4713]: I0314 05:48:40.731353 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:48:40 crc kubenswrapper[4713]: I0314 05:48:40.731784 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:48:40 crc kubenswrapper[4713]: I0314 05:48:40.731829 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:48:40 crc kubenswrapper[4713]: I0314 05:48:40.732452 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0deb757193504738dea1bcbab0c00a2cad7d7bcdee1ff823b40c52d856730f6e"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 05:48:40 crc kubenswrapper[4713]: I0314 05:48:40.732505 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://0deb757193504738dea1bcbab0c00a2cad7d7bcdee1ff823b40c52d856730f6e" gracePeriod=600 Mar 14 05:48:41 crc kubenswrapper[4713]: I0314 05:48:41.017843 4713 generic.go:334] "Generic (PLEG): container 
finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="0deb757193504738dea1bcbab0c00a2cad7d7bcdee1ff823b40c52d856730f6e" exitCode=0 Mar 14 05:48:41 crc kubenswrapper[4713]: I0314 05:48:41.017898 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"0deb757193504738dea1bcbab0c00a2cad7d7bcdee1ff823b40c52d856730f6e"} Mar 14 05:48:41 crc kubenswrapper[4713]: I0314 05:48:41.017939 4713 scope.go:117] "RemoveContainer" containerID="b10f030c50b79b6c8fa097a372898693b75a3027ed8338227f2cd1cda4fb2db1" Mar 14 05:48:42 crc kubenswrapper[4713]: I0314 05:48:42.044228 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"540f96db525a7cfd501d37d526d1efc7f6c97b5c6c41b9d3d69eed7cce8a0419"} Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.276199 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.277581 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.282633 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lxm89" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.287160 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.288471 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.293006 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-zmv4g" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.298160 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.306770 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.308200 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.309850 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-m4w6k" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.336670 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.368907 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.370306 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.383959 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-lx7tb" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.391768 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.401096 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.412822 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.414025 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.420626 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jqnkl" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.427741 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62cpq\" (UniqueName: \"kubernetes.io/projected/12eb62d0-8721-4482-b4a3-148a61cea029-kube-api-access-62cpq\") pod \"barbican-operator-controller-manager-d47688694-kq9dl\" (UID: \"12eb62d0-8721-4482-b4a3-148a61cea029\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.427823 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whf7z\" (UniqueName: \"kubernetes.io/projected/fa62dff3-1643-4e94-b31a-d56b21a2327d-kube-api-access-whf7z\") pod \"designate-operator-controller-manager-66d56f6ff4-j4474\" (UID: \"fa62dff3-1643-4e94-b31a-d56b21a2327d\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.427890 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbshx\" (UniqueName: \"kubernetes.io/projected/4128f2c6-d929-4815-8502-291baf22f24f-kube-api-access-vbshx\") pod \"cinder-operator-controller-manager-984cd4dcf-6j4tq\" (UID: \"4128f2c6-d929-4815-8502-291baf22f24f\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.435360 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 
05:49:00.436412 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.440698 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-plbgk" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.443336 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.480314 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.508965 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.516279 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vrpcl" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.517462 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.532900 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.534377 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62cpq\" (UniqueName: \"kubernetes.io/projected/12eb62d0-8721-4482-b4a3-148a61cea029-kube-api-access-62cpq\") pod \"barbican-operator-controller-manager-d47688694-kq9dl\" (UID: \"12eb62d0-8721-4482-b4a3-148a61cea029\") " 
pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.534506 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whf7z\" (UniqueName: \"kubernetes.io/projected/fa62dff3-1643-4e94-b31a-d56b21a2327d-kube-api-access-whf7z\") pod \"designate-operator-controller-manager-66d56f6ff4-j4474\" (UID: \"fa62dff3-1643-4e94-b31a-d56b21a2327d\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.534576 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbshx\" (UniqueName: \"kubernetes.io/projected/4128f2c6-d929-4815-8502-291baf22f24f-kube-api-access-vbshx\") pod \"cinder-operator-controller-manager-984cd4dcf-6j4tq\" (UID: \"4128f2c6-d929-4815-8502-291baf22f24f\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.534664 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jj6z\" (UniqueName: \"kubernetes.io/projected/6a712708-53c3-4854-9a45-3442ee780cdc-kube-api-access-2jj6z\") pod \"horizon-operator-controller-manager-6d9d6b584d-j587n\" (UID: \"6a712708-53c3-4854-9a45-3442ee780cdc\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.534694 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hj48\" (UniqueName: \"kubernetes.io/projected/ca4c3a10-3f60-460b-ad21-258a757bf57c-kube-api-access-7hj48\") pod \"heat-operator-controller-manager-77b6666d85-xvmqf\" (UID: \"ca4c3a10-3f60-460b-ad21-258a757bf57c\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" Mar 14 05:49:00 crc kubenswrapper[4713]: 
I0314 05:49:00.534725 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dct7r\" (UniqueName: \"kubernetes.io/projected/bae008e7-4329-4d30-9820-81daf4300f96-kube-api-access-dct7r\") pod \"glance-operator-controller-manager-5964f64c48-zz92h\" (UID: \"bae008e7-4329-4d30-9820-81daf4300f96\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.571877 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62cpq\" (UniqueName: \"kubernetes.io/projected/12eb62d0-8721-4482-b4a3-148a61cea029-kube-api-access-62cpq\") pod \"barbican-operator-controller-manager-d47688694-kq9dl\" (UID: \"12eb62d0-8721-4482-b4a3-148a61cea029\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.574838 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whf7z\" (UniqueName: \"kubernetes.io/projected/fa62dff3-1643-4e94-b31a-d56b21a2327d-kube-api-access-whf7z\") pod \"designate-operator-controller-manager-66d56f6ff4-j4474\" (UID: \"fa62dff3-1643-4e94-b31a-d56b21a2327d\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.578090 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbshx\" (UniqueName: \"kubernetes.io/projected/4128f2c6-d929-4815-8502-291baf22f24f-kube-api-access-vbshx\") pod \"cinder-operator-controller-manager-984cd4dcf-6j4tq\" (UID: \"4128f2c6-d929-4815-8502-291baf22f24f\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.588441 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr"] Mar 14 
05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.612296 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.613005 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.638310 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.639434 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.640943 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.643771 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.643827 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jj6z\" (UniqueName: \"kubernetes.io/projected/6a712708-53c3-4854-9a45-3442ee780cdc-kube-api-access-2jj6z\") pod \"horizon-operator-controller-manager-6d9d6b584d-j587n\" (UID: \"6a712708-53c3-4854-9a45-3442ee780cdc\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" Mar 14 05:49:00 crc kubenswrapper[4713]: 
I0314 05:49:00.643856 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hj48\" (UniqueName: \"kubernetes.io/projected/ca4c3a10-3f60-460b-ad21-258a757bf57c-kube-api-access-7hj48\") pod \"heat-operator-controller-manager-77b6666d85-xvmqf\" (UID: \"ca4c3a10-3f60-460b-ad21-258a757bf57c\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.643878 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66zvc\" (UniqueName: \"kubernetes.io/projected/cbc588fa-b052-4336-81fe-2fed809e251b-kube-api-access-66zvc\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.643905 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dct7r\" (UniqueName: \"kubernetes.io/projected/bae008e7-4329-4d30-9820-81daf4300f96-kube-api-access-dct7r\") pod \"glance-operator-controller-manager-5964f64c48-zz92h\" (UID: \"bae008e7-4329-4d30-9820-81daf4300f96\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.646380 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7d45g" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.660271 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.661840 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.683078 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dct7r\" (UniqueName: \"kubernetes.io/projected/bae008e7-4329-4d30-9820-81daf4300f96-kube-api-access-dct7r\") pod \"glance-operator-controller-manager-5964f64c48-zz92h\" (UID: \"bae008e7-4329-4d30-9820-81daf4300f96\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.688154 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hj48\" (UniqueName: \"kubernetes.io/projected/ca4c3a10-3f60-460b-ad21-258a757bf57c-kube-api-access-7hj48\") pod \"heat-operator-controller-manager-77b6666d85-xvmqf\" (UID: \"ca4c3a10-3f60-460b-ad21-258a757bf57c\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.688631 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mfm7c" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.693578 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.699287 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jj6z\" (UniqueName: \"kubernetes.io/projected/6a712708-53c3-4854-9a45-3442ee780cdc-kube-api-access-2jj6z\") pod \"horizon-operator-controller-manager-6d9d6b584d-j587n\" (UID: \"6a712708-53c3-4854-9a45-3442ee780cdc\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.706943 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.721258 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.741737 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.752279 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.752352 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66zvc\" (UniqueName: \"kubernetes.io/projected/cbc588fa-b052-4336-81fe-2fed809e251b-kube-api-access-66zvc\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.752428 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw54d\" (UniqueName: \"kubernetes.io/projected/9da309ad-34cc-4b06-b166-c571b5a39825-kube-api-access-sw54d\") pod \"ironic-operator-controller-manager-5bc894d9b-6xshs\" (UID: \"9da309ad-34cc-4b06-b166-c571b5a39825\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.752498 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gs6q\" (UniqueName: \"kubernetes.io/projected/46274028-feea-4c48-b086-44533fc3e996-kube-api-access-7gs6q\") pod \"keystone-operator-controller-manager-684f77d66d-shvlw\" (UID: \"46274028-feea-4c48-b086-44533fc3e996\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" Mar 14 05:49:00 crc kubenswrapper[4713]: E0314 05:49:00.752706 
4713 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:00 crc kubenswrapper[4713]: E0314 05:49:00.752763 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert podName:cbc588fa-b052-4336-81fe-2fed809e251b nodeName:}" failed. No retries permitted until 2026-03-14 05:49:01.252739905 +0000 UTC m=+1324.340649205 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert") pod "infra-operator-controller-manager-54dc5b8f8d-fm6fr" (UID: "cbc588fa-b052-4336-81fe-2fed809e251b") : secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.760280 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.762773 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.766851 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.767353 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.767993 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7q8j4" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.768131 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.770623 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-j5z8k" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.792712 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.792751 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.797522 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66zvc\" (UniqueName: \"kubernetes.io/projected/cbc588fa-b052-4336-81fe-2fed809e251b-kube-api-access-66zvc\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.809690 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-8mm88"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.824927 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.825368 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.826540 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.830633 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-57ddd" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.830928 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-h9gdd" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.853249 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.854360 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67qq6\" (UniqueName: \"kubernetes.io/projected/149dd450-69f3-4d71-aac3-90052dcf2253-kube-api-access-67qq6\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-46gkk\" (UID: \"149dd450-69f3-4d71-aac3-90052dcf2253\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.854458 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw54d\" (UniqueName: \"kubernetes.io/projected/9da309ad-34cc-4b06-b166-c571b5a39825-kube-api-access-sw54d\") pod \"ironic-operator-controller-manager-5bc894d9b-6xshs\" (UID: \"9da309ad-34cc-4b06-b166-c571b5a39825\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.854511 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgrj8\" (UniqueName: \"kubernetes.io/projected/ccd30d62-2e42-4399-b1eb-dfde3782dcb8-kube-api-access-wgrj8\") pod \"manila-operator-controller-manager-57b484b4df-gbrmh\" 
(UID: \"ccd30d62-2e42-4399-b1eb-dfde3782dcb8\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.858525 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gs6q\" (UniqueName: \"kubernetes.io/projected/46274028-feea-4c48-b086-44533fc3e996-kube-api-access-7gs6q\") pod \"keystone-operator-controller-manager-684f77d66d-shvlw\" (UID: \"46274028-feea-4c48-b086-44533fc3e996\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.880608 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw54d\" (UniqueName: \"kubernetes.io/projected/9da309ad-34cc-4b06-b166-c571b5a39825-kube-api-access-sw54d\") pod \"ironic-operator-controller-manager-5bc894d9b-6xshs\" (UID: \"9da309ad-34cc-4b06-b166-c571b5a39825\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.880671 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-8mm88"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.895558 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.897870 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.899862 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-zzgwp" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.902099 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gs6q\" (UniqueName: \"kubernetes.io/projected/46274028-feea-4c48-b086-44533fc3e996-kube-api-access-7gs6q\") pod \"keystone-operator-controller-manager-684f77d66d-shvlw\" (UID: \"46274028-feea-4c48-b086-44533fc3e996\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.914429 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.931570 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2"] Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.933526 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.941791 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.941991 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-sqgk6" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.966980 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pqp5\" (UniqueName: \"kubernetes.io/projected/50d43641-0638-4763-9123-0c0c2c76629e-kube-api-access-2pqp5\") pod \"octavia-operator-controller-manager-5f4f55cb5c-nfgb5\" (UID: \"50d43641-0638-4763-9123-0c0c2c76629e\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.967076 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5tg4\" (UniqueName: \"kubernetes.io/projected/409a2a8b-7e66-4763-9698-3a909f051c50-kube-api-access-f5tg4\") pod \"nova-operator-controller-manager-7f84474648-8mm88\" (UID: \"409a2a8b-7e66-4763-9698-3a909f051c50\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.968302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgrj8\" (UniqueName: \"kubernetes.io/projected/ccd30d62-2e42-4399-b1eb-dfde3782dcb8-kube-api-access-wgrj8\") pod \"manila-operator-controller-manager-57b484b4df-gbrmh\" (UID: \"ccd30d62-2e42-4399-b1eb-dfde3782dcb8\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 
05:49:00.968888 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgw4p\" (UniqueName: \"kubernetes.io/projected/d5c6be47-5c06-46e0-ae8c-87b7a3f23561-kube-api-access-lgw4p\") pod \"neutron-operator-controller-manager-776c5696bf-7t4g8\" (UID: \"d5c6be47-5c06-46e0-ae8c-87b7a3f23561\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" Mar 14 05:49:00 crc kubenswrapper[4713]: I0314 05:49:00.969096 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67qq6\" (UniqueName: \"kubernetes.io/projected/149dd450-69f3-4d71-aac3-90052dcf2253-kube-api-access-67qq6\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-46gkk\" (UID: \"149dd450-69f3-4d71-aac3-90052dcf2253\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.001584 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.009041 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67qq6\" (UniqueName: \"kubernetes.io/projected/149dd450-69f3-4d71-aac3-90052dcf2253-kube-api-access-67qq6\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-46gkk\" (UID: \"149dd450-69f3-4d71-aac3-90052dcf2253\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.012833 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgrj8\" (UniqueName: \"kubernetes.io/projected/ccd30d62-2e42-4399-b1eb-dfde3782dcb8-kube-api-access-wgrj8\") pod \"manila-operator-controller-manager-57b484b4df-gbrmh\" (UID: \"ccd30d62-2e42-4399-b1eb-dfde3782dcb8\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" Mar 14 
05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.044558 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.046144 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.046827 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.053262 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-l478v" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.053482 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-lzz5l" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.070044 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.071341 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5tg4\" (UniqueName: \"kubernetes.io/projected/409a2a8b-7e66-4763-9698-3a909f051c50-kube-api-access-f5tg4\") pod \"nova-operator-controller-manager-7f84474648-8mm88\" (UID: \"409a2a8b-7e66-4763-9698-3a909f051c50\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.071390 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xdmw\" (UniqueName: \"kubernetes.io/projected/4da1ed21-82a5-400c-a201-653fe58adf4c-kube-api-access-4xdmw\") pod 
\"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.071423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgw4p\" (UniqueName: \"kubernetes.io/projected/d5c6be47-5c06-46e0-ae8c-87b7a3f23561-kube-api-access-lgw4p\") pod \"neutron-operator-controller-manager-776c5696bf-7t4g8\" (UID: \"d5c6be47-5c06-46e0-ae8c-87b7a3f23561\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.071459 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.071592 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pqp5\" (UniqueName: \"kubernetes.io/projected/50d43641-0638-4763-9123-0c0c2c76629e-kube-api-access-2pqp5\") pod \"octavia-operator-controller-manager-5f4f55cb5c-nfgb5\" (UID: \"50d43641-0638-4763-9123-0c0c2c76629e\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.094313 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.108771 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d"] Mar 14 
05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.112647 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5tg4\" (UniqueName: \"kubernetes.io/projected/409a2a8b-7e66-4763-9698-3a909f051c50-kube-api-access-f5tg4\") pod \"nova-operator-controller-manager-7f84474648-8mm88\" (UID: \"409a2a8b-7e66-4763-9698-3a909f051c50\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.118458 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.119838 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.127117 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.128744 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4wzsk" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.128895 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.136058 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.140064 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgw4p\" (UniqueName: \"kubernetes.io/projected/d5c6be47-5c06-46e0-ae8c-87b7a3f23561-kube-api-access-lgw4p\") pod \"neutron-operator-controller-manager-776c5696bf-7t4g8\" (UID: 
\"d5c6be47-5c06-46e0-ae8c-87b7a3f23561\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.140445 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.140928 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pqp5\" (UniqueName: \"kubernetes.io/projected/50d43641-0638-4763-9123-0c0c2c76629e-kube-api-access-2pqp5\") pod \"octavia-operator-controller-manager-5f4f55cb5c-nfgb5\" (UID: \"50d43641-0638-4763-9123-0c0c2c76629e\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.147790 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2tnv6" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.151007 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.156795 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.160270 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.161440 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.169869 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vblhf" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.173711 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztfvd\" (UniqueName: \"kubernetes.io/projected/a55d0754-702d-4dbc-995a-b98d852678ce-kube-api-access-ztfvd\") pod \"ovn-operator-controller-manager-bbc5b68f9-kv59d\" (UID: \"a55d0754-702d-4dbc-995a-b98d852678ce\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.173798 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xdmw\" (UniqueName: \"kubernetes.io/projected/4da1ed21-82a5-400c-a201-653fe58adf4c-kube-api-access-4xdmw\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.173870 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.173909 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smbzx\" (UniqueName: \"kubernetes.io/projected/aa4ff369-f2af-439f-b9f6-2c8301e80210-kube-api-access-smbzx\") pod 
\"placement-operator-controller-manager-574d45c66c-xmbz2\" (UID: \"aa4ff369-f2af-439f-b9f6-2c8301e80210\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.174500 4713 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.174555 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert podName:4da1ed21-82a5-400c-a201-653fe58adf4c nodeName:}" failed. No retries permitted until 2026-03-14 05:49:01.674532099 +0000 UTC m=+1324.762441399 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" (UID: "4da1ed21-82a5-400c-a201-653fe58adf4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.182787 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.191391 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.206736 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.251023 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xdmw\" (UniqueName: \"kubernetes.io/projected/4da1ed21-82a5-400c-a201-653fe58adf4c-kube-api-access-4xdmw\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.283301 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.298047 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.310307 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.312792 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.321338 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.322752 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qzhv\" (UniqueName: \"kubernetes.io/projected/383e8493-0661-4b45-a72c-5851b520c65b-kube-api-access-2qzhv\") pod \"swift-operator-controller-manager-7f9cc5dd44-2r2fz\" (UID: \"383e8493-0661-4b45-a72c-5851b520c65b\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.322826 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5grr\" (UniqueName: \"kubernetes.io/projected/fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd-kube-api-access-h5grr\") pod \"test-operator-controller-manager-5c5cb9c4d7-6lw5h\" (UID: \"fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.322856 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztfvd\" (UniqueName: \"kubernetes.io/projected/a55d0754-702d-4dbc-995a-b98d852678ce-kube-api-access-ztfvd\") pod \"ovn-operator-controller-manager-bbc5b68f9-kv59d\" (UID: \"a55d0754-702d-4dbc-995a-b98d852678ce\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.322965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smbzx\" (UniqueName: \"kubernetes.io/projected/aa4ff369-f2af-439f-b9f6-2c8301e80210-kube-api-access-smbzx\") pod \"placement-operator-controller-manager-574d45c66c-xmbz2\" (UID: \"aa4ff369-f2af-439f-b9f6-2c8301e80210\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" Mar 14 05:49:01 crc 
kubenswrapper[4713]: I0314 05:49:01.323030 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2sb\" (UniqueName: \"kubernetes.io/projected/4101fac4-706c-4e2b-9203-102d0874c3ba-kube-api-access-6m2sb\") pod \"telemetry-operator-controller-manager-75b7bc4c47-ltr87\" (UID: \"4101fac4-706c-4e2b-9203-102d0874c3ba\") " pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.323072 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.323284 4713 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.323338 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert podName:cbc588fa-b052-4336-81fe-2fed809e251b nodeName:}" failed. No retries permitted until 2026-03-14 05:49:02.323322221 +0000 UTC m=+1325.411231521 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert") pod "infra-operator-controller-manager-54dc5b8f8d-fm6fr" (UID: "cbc588fa-b052-4336-81fe-2fed809e251b") : secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.340055 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-h95jf" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.386850 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.405915 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smbzx\" (UniqueName: \"kubernetes.io/projected/aa4ff369-f2af-439f-b9f6-2c8301e80210-kube-api-access-smbzx\") pod \"placement-operator-controller-manager-574d45c66c-xmbz2\" (UID: \"aa4ff369-f2af-439f-b9f6-2c8301e80210\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.408245 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztfvd\" (UniqueName: \"kubernetes.io/projected/a55d0754-702d-4dbc-995a-b98d852678ce-kube-api-access-ztfvd\") pod \"ovn-operator-controller-manager-bbc5b68f9-kv59d\" (UID: \"a55d0754-702d-4dbc-995a-b98d852678ce\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.425249 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59b46\" (UniqueName: \"kubernetes.io/projected/3941b4bd-470d-4351-aed9-4bc1f90f9ad4-kube-api-access-59b46\") pod \"watcher-operator-controller-manager-6c4d75f7f9-pqnf9\" (UID: \"3941b4bd-470d-4351-aed9-4bc1f90f9ad4\") " 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.425384 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2sb\" (UniqueName: \"kubernetes.io/projected/4101fac4-706c-4e2b-9203-102d0874c3ba-kube-api-access-6m2sb\") pod \"telemetry-operator-controller-manager-75b7bc4c47-ltr87\" (UID: \"4101fac4-706c-4e2b-9203-102d0874c3ba\") " pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.425433 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qzhv\" (UniqueName: \"kubernetes.io/projected/383e8493-0661-4b45-a72c-5851b520c65b-kube-api-access-2qzhv\") pod \"swift-operator-controller-manager-7f9cc5dd44-2r2fz\" (UID: \"383e8493-0661-4b45-a72c-5851b520c65b\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.425474 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5grr\" (UniqueName: \"kubernetes.io/projected/fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd-kube-api-access-h5grr\") pod \"test-operator-controller-manager-5c5cb9c4d7-6lw5h\" (UID: \"fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.456399 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.457732 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.462264 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.471719 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.472712 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-445h4" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.472884 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.491866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5grr\" (UniqueName: \"kubernetes.io/projected/fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd-kube-api-access-h5grr\") pod \"test-operator-controller-manager-5c5cb9c4d7-6lw5h\" (UID: \"fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.500186 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.501129 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.515055 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vzhm6" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.519491 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qzhv\" (UniqueName: \"kubernetes.io/projected/383e8493-0661-4b45-a72c-5851b520c65b-kube-api-access-2qzhv\") pod \"swift-operator-controller-manager-7f9cc5dd44-2r2fz\" (UID: \"383e8493-0661-4b45-a72c-5851b520c65b\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.529523 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2sb\" (UniqueName: \"kubernetes.io/projected/4101fac4-706c-4e2b-9203-102d0874c3ba-kube-api-access-6m2sb\") pod \"telemetry-operator-controller-manager-75b7bc4c47-ltr87\" (UID: \"4101fac4-706c-4e2b-9203-102d0874c3ba\") " pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.530293 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9ld7\" (UniqueName: \"kubernetes.io/projected/92164fd9-b08c-4b00-975c-0fcdd245f8f9-kube-api-access-d9ld7\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.530367 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs\") pod 
\"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.530422 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59b46\" (UniqueName: \"kubernetes.io/projected/3941b4bd-470d-4351-aed9-4bc1f90f9ad4-kube-api-access-59b46\") pod \"watcher-operator-controller-manager-6c4d75f7f9-pqnf9\" (UID: \"3941b4bd-470d-4351-aed9-4bc1f90f9ad4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.530443 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.562397 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55"] Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.579563 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59b46\" (UniqueName: \"kubernetes.io/projected/3941b4bd-470d-4351-aed9-4bc1f90f9ad4-kube-api-access-59b46\") pod \"watcher-operator-controller-manager-6c4d75f7f9-pqnf9\" (UID: \"3941b4bd-470d-4351-aed9-4bc1f90f9ad4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.583688 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.588672 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.645312 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9ld7\" (UniqueName: \"kubernetes.io/projected/92164fd9-b08c-4b00-975c-0fcdd245f8f9-kube-api-access-d9ld7\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.645446 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.645589 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.645743 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fddr\" (UniqueName: \"kubernetes.io/projected/9be88166-e4c0-464c-9dc0-a8a51595c555-kube-api-access-8fddr\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-mlh55\" (UID: \"9be88166-e4c0-464c-9dc0-a8a51595c555\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.645788 4713 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.645829 4713 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.645865 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:02.145846391 +0000 UTC m=+1325.233755691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "metrics-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.645897 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:02.145876902 +0000 UTC m=+1325.233786272 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "webhook-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.663289 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9ld7\" (UniqueName: \"kubernetes.io/projected/92164fd9-b08c-4b00-975c-0fcdd245f8f9-kube-api-access-d9ld7\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.747757 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.747823 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fddr\" (UniqueName: \"kubernetes.io/projected/9be88166-e4c0-464c-9dc0-a8a51595c555-kube-api-access-8fddr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mlh55\" (UID: \"9be88166-e4c0-464c-9dc0-a8a51595c555\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.749965 4713 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: E0314 05:49:01.750033 4713 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert podName:4da1ed21-82a5-400c-a201-653fe58adf4c nodeName:}" failed. No retries permitted until 2026-03-14 05:49:02.750012341 +0000 UTC m=+1325.837921661 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" (UID: "4da1ed21-82a5-400c-a201-653fe58adf4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.771073 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fddr\" (UniqueName: \"kubernetes.io/projected/9be88166-e4c0-464c-9dc0-a8a51595c555-kube-api-access-8fddr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mlh55\" (UID: \"9be88166-e4c0-464c-9dc0-a8a51595c555\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" Mar 14 05:49:01 crc kubenswrapper[4713]: I0314 05:49:01.894598 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.052465 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.095867 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.155285 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq"] Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.160077 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.160257 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:02.160373 4713 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:02.160425 4713 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:02.160471 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:03.160448903 +0000 UTC m=+1326.248358203 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "webhook-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:02.160510 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:03.160489674 +0000 UTC m=+1326.248399044 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "metrics-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.202562 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.231935 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.364639 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:02.365424 4713 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:02.365488 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert podName:cbc588fa-b052-4336-81fe-2fed809e251b nodeName:}" failed. No retries permitted until 2026-03-14 05:49:04.365451787 +0000 UTC m=+1327.453361087 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert") pod "infra-operator-controller-manager-54dc5b8f8d-fm6fr" (UID: "cbc588fa-b052-4336-81fe-2fed809e251b") : secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.774636 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:02.774800 4713 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:02.774864 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert podName:4da1ed21-82a5-400c-a201-653fe58adf4c nodeName:}" failed. No retries permitted until 2026-03-14 05:49:04.774848286 +0000 UTC m=+1327.862757576 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" (UID: "4da1ed21-82a5-400c-a201-653fe58adf4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.854284 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474"] Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.894848 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h"] Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.903043 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf"] Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:02.910524 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl"] Mar 14 05:49:03 crc kubenswrapper[4713]: W0314 05:49:02.911581 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12eb62d0_8721_4482_b4a3_148a61cea029.slice/crio-daf33fed665ab2f390bdcb97d9419e576fd6babf05ac5722e9d2c9c76261394c WatchSource:0}: Error finding container daf33fed665ab2f390bdcb97d9419e576fd6babf05ac5722e9d2c9c76261394c: Status 404 returned error can't find the container with id daf33fed665ab2f390bdcb97d9419e576fd6babf05ac5722e9d2c9c76261394c Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:03.195523 4713 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:03.196075 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:05.196051571 +0000 UTC m=+1328.283960871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "metrics-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:03.195342 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:03.196690 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:03.196786 4713 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: E0314 05:49:03.196825 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. 
No retries permitted until 2026-03-14 05:49:05.196814475 +0000 UTC m=+1328.284723775 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "webhook-server-cert" not found Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:03.269315 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" event={"ID":"4128f2c6-d929-4815-8502-291baf22f24f","Type":"ContainerStarted","Data":"b01d62bdbc7da03822c07826d51148722e5d42c000a2d7d9e809ffdba15485ab"} Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:03.278494 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" event={"ID":"ca4c3a10-3f60-460b-ad21-258a757bf57c","Type":"ContainerStarted","Data":"5c0d20f6b9746684eb18d7ce9888f8c7253e898e528b2d7b960815927bf7fefa"} Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:03.282872 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" event={"ID":"fa62dff3-1643-4e94-b31a-d56b21a2327d","Type":"ContainerStarted","Data":"46d85f1145783a0af7133b33f9ef25926ed282d1fff12b0cdea95d04fa04c135"} Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:03.289447 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" event={"ID":"12eb62d0-8721-4482-b4a3-148a61cea029","Type":"ContainerStarted","Data":"daf33fed665ab2f390bdcb97d9419e576fd6babf05ac5722e9d2c9c76261394c"} Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:03.307074 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" 
event={"ID":"bae008e7-4329-4d30-9820-81daf4300f96","Type":"ContainerStarted","Data":"73f47638e14af536c1abb54ff04814f7ff1cd9b58031a1f822ee7594a5e84e72"} Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:03.408431 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw"] Mar 14 05:49:03 crc kubenswrapper[4713]: I0314 05:49:03.478493 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n"] Mar 14 05:49:03 crc kubenswrapper[4713]: W0314 05:49:03.489529 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46274028_feea_4c48_b086_44533fc3e996.slice/crio-c9c905c9933503289ae7d041580c39c7576c81eb1770dcac78a71f90ab980e39 WatchSource:0}: Error finding container c9c905c9933503289ae7d041580c39c7576c81eb1770dcac78a71f90ab980e39: Status 404 returned error can't find the container with id c9c905c9933503289ae7d041580c39c7576c81eb1770dcac78a71f90ab980e39 Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.331510 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" event={"ID":"46274028-feea-4c48-b086-44533fc3e996","Type":"ContainerStarted","Data":"c9c905c9933503289ae7d041580c39c7576c81eb1770dcac78a71f90ab980e39"} Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.333836 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" event={"ID":"6a712708-53c3-4854-9a45-3442ee780cdc","Type":"ContainerStarted","Data":"e7eeea883a1553d69d83c27d982db9a469dd3e6934dd499c575f6cf937d34ac4"} Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.438589 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:04 crc kubenswrapper[4713]: E0314 05:49:04.439296 4713 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:04 crc kubenswrapper[4713]: E0314 05:49:04.439368 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert podName:cbc588fa-b052-4336-81fe-2fed809e251b nodeName:}" failed. No retries permitted until 2026-03-14 05:49:08.439349828 +0000 UTC m=+1331.527259128 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert") pod "infra-operator-controller-manager-54dc5b8f8d-fm6fr" (UID: "cbc588fa-b052-4336-81fe-2fed809e251b") : secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.595079 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.611786 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.623630 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.684166 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.692434 4713 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.709160 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.744676 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55"] Mar 14 05:49:04 crc kubenswrapper[4713]: W0314 05:49:04.749411 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa4ff369_f2af_439f_b9f6_2c8301e80210.slice/crio-ddafedcf6ae781fa60159f6ae7ee8799ce5c97bc34d8c37ca3685e650b381aa4 WatchSource:0}: Error finding container ddafedcf6ae781fa60159f6ae7ee8799ce5c97bc34d8c37ca3685e650b381aa4: Status 404 returned error can't find the container with id ddafedcf6ae781fa60159f6ae7ee8799ce5c97bc34d8c37ca3685e650b381aa4 Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.761010 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.768530 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk"] Mar 14 05:49:04 crc kubenswrapper[4713]: W0314 05:49:04.770262 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9da309ad_34cc_4b06_b166_c571b5a39825.slice/crio-7d83e3bfb726e4374192ff44126ef71928eeb4321f1532c7854d0e811ebc498d WatchSource:0}: Error finding container 7d83e3bfb726e4374192ff44126ef71928eeb4321f1532c7854d0e811ebc498d: Status 404 returned error can't find the container with id 7d83e3bfb726e4374192ff44126ef71928eeb4321f1532c7854d0e811ebc498d Mar 14 05:49:04 crc 
kubenswrapper[4713]: I0314 05:49:04.779764 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-8mm88"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.788607 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8"] Mar 14 05:49:04 crc kubenswrapper[4713]: W0314 05:49:04.797607 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod149dd450_69f3_4d71_aac3_90052dcf2253.slice/crio-4cc3b53f1e70ab00489727c975adac8eb5caf85d86a9ab3ae8b687dba779bd0a WatchSource:0}: Error finding container 4cc3b53f1e70ab00489727c975adac8eb5caf85d86a9ab3ae8b687dba779bd0a: Status 404 returned error can't find the container with id 4cc3b53f1e70ab00489727c975adac8eb5caf85d86a9ab3ae8b687dba779bd0a Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.800287 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.811061 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87"] Mar 14 05:49:04 crc kubenswrapper[4713]: I0314 05:49:04.867650 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:04 crc kubenswrapper[4713]: E0314 05:49:04.868293 4713 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 
14 05:49:04 crc kubenswrapper[4713]: E0314 05:49:04.868359 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert podName:4da1ed21-82a5-400c-a201-653fe58adf4c nodeName:}" failed. No retries permitted until 2026-03-14 05:49:08.868337572 +0000 UTC m=+1331.956246872 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" (UID: "4da1ed21-82a5-400c-a201-653fe58adf4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:04 crc kubenswrapper[4713]: E0314 05:49:04.943000 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.177:5001/openstack-k8s-operators/telemetry-operator:ba5ab716edb70fdd2398e138f71c819bd2f08328,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6m2sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-75b7bc4c47-ltr87_openstack-operators(4101fac4-706c-4e2b-9203-102d0874c3ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 05:49:04 crc kubenswrapper[4713]: E0314 05:49:04.943113 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5grr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-6lw5h_openstack-operators(fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 05:49:04 crc kubenswrapper[4713]: E0314 05:49:04.944282 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" podUID="4101fac4-706c-4e2b-9203-102d0874c3ba" Mar 14 05:49:04 crc kubenswrapper[4713]: E0314 05:49:04.944879 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.278065 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: 
\"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.278145 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:05 crc kubenswrapper[4713]: E0314 05:49:05.278286 4713 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 05:49:05 crc kubenswrapper[4713]: E0314 05:49:05.278555 4713 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 05:49:05 crc kubenswrapper[4713]: E0314 05:49:05.278647 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:09.278622188 +0000 UTC m=+1332.366531488 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "webhook-server-cert" not found Mar 14 05:49:05 crc kubenswrapper[4713]: E0314 05:49:05.278666 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:09.278658519 +0000 UTC m=+1332.366567819 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "metrics-server-cert" not found Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.360961 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" event={"ID":"149dd450-69f3-4d71-aac3-90052dcf2253","Type":"ContainerStarted","Data":"4cc3b53f1e70ab00489727c975adac8eb5caf85d86a9ab3ae8b687dba779bd0a"} Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.365334 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" event={"ID":"fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd","Type":"ContainerStarted","Data":"dbc52cd22da6b00f469f4842bc49c7d4c8a867c5f9460d273d40b6f8e7d1e3df"} Mar 14 05:49:05 crc kubenswrapper[4713]: E0314 05:49:05.367327 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.367750 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" event={"ID":"3941b4bd-470d-4351-aed9-4bc1f90f9ad4","Type":"ContainerStarted","Data":"a897f211227cf6eaec5efcc1bdcd916bafac78f00411a0d0edecd12c3da3ff09"} Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.369830 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" event={"ID":"383e8493-0661-4b45-a72c-5851b520c65b","Type":"ContainerStarted","Data":"bfd7fa072bbd93a07a71a6b271742b44e0a17c60f7fcdfeaedce8c3c76d429cb"} Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.371429 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" event={"ID":"50d43641-0638-4763-9123-0c0c2c76629e","Type":"ContainerStarted","Data":"f61575d9ab00349860d3bfff252202c439875ebdc67dc53add8a5ac957dcc289"} Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.374133 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" event={"ID":"409a2a8b-7e66-4763-9698-3a909f051c50","Type":"ContainerStarted","Data":"3c7de1741da338845bbf392069edc38681e8f4bea332a9d12be4a20515f56eea"} Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.375968 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" event={"ID":"d5c6be47-5c06-46e0-ae8c-87b7a3f23561","Type":"ContainerStarted","Data":"f9571eba8d3a7945646d2d80f13d0f915ebd522844e7e70770dc4ffe072c3bf1"} Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.386532 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" event={"ID":"aa4ff369-f2af-439f-b9f6-2c8301e80210","Type":"ContainerStarted","Data":"ddafedcf6ae781fa60159f6ae7ee8799ce5c97bc34d8c37ca3685e650b381aa4"} Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.395335 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" event={"ID":"ccd30d62-2e42-4399-b1eb-dfde3782dcb8","Type":"ContainerStarted","Data":"5d03bcaa4facac9aae3c9b464c760362ccdf654cdeba55c24f2d7f872784b629"} Mar 14 05:49:05 crc 
kubenswrapper[4713]: I0314 05:49:05.399021 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" event={"ID":"4101fac4-706c-4e2b-9203-102d0874c3ba","Type":"ContainerStarted","Data":"40c70b33f235f096687285d87714fba7aba6b5c8c844de82040d2e6f813ba068"} Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.402620 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" event={"ID":"a55d0754-702d-4dbc-995a-b98d852678ce","Type":"ContainerStarted","Data":"fd0295963b9b2a9da1eea5fa6aa3f28148944a09c5106ea6749bab4ae52d5762"} Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.405980 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" event={"ID":"9be88166-e4c0-464c-9dc0-a8a51595c555","Type":"ContainerStarted","Data":"65a70455b015d7111c8928fc536e8c59c9b049e8c31d993ead894638cb12484b"} Mar 14 05:49:05 crc kubenswrapper[4713]: E0314 05:49:05.407119 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.177:5001/openstack-k8s-operators/telemetry-operator:ba5ab716edb70fdd2398e138f71c819bd2f08328\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" podUID="4101fac4-706c-4e2b-9203-102d0874c3ba" Mar 14 05:49:05 crc kubenswrapper[4713]: I0314 05:49:05.412186 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" event={"ID":"9da309ad-34cc-4b06-b166-c571b5a39825","Type":"ContainerStarted","Data":"7d83e3bfb726e4374192ff44126ef71928eeb4321f1532c7854d0e811ebc498d"} Mar 14 05:49:06 crc kubenswrapper[4713]: E0314 05:49:06.451894 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" Mar 14 05:49:06 crc kubenswrapper[4713]: E0314 05:49:06.452118 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.177:5001/openstack-k8s-operators/telemetry-operator:ba5ab716edb70fdd2398e138f71c819bd2f08328\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" podUID="4101fac4-706c-4e2b-9203-102d0874c3ba" Mar 14 05:49:08 crc kubenswrapper[4713]: I0314 05:49:08.484635 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:08 crc kubenswrapper[4713]: E0314 05:49:08.484852 4713 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:08 crc kubenswrapper[4713]: E0314 05:49:08.485048 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert podName:cbc588fa-b052-4336-81fe-2fed809e251b nodeName:}" failed. No retries permitted until 2026-03-14 05:49:16.485030636 +0000 UTC m=+1339.572939936 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert") pod "infra-operator-controller-manager-54dc5b8f8d-fm6fr" (UID: "cbc588fa-b052-4336-81fe-2fed809e251b") : secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:08 crc kubenswrapper[4713]: I0314 05:49:08.894837 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:08 crc kubenswrapper[4713]: E0314 05:49:08.895015 4713 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:08 crc kubenswrapper[4713]: E0314 05:49:08.895087 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert podName:4da1ed21-82a5-400c-a201-653fe58adf4c nodeName:}" failed. No retries permitted until 2026-03-14 05:49:16.895067895 +0000 UTC m=+1339.982977205 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" (UID: "4da1ed21-82a5-400c-a201-653fe58adf4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:09 crc kubenswrapper[4713]: I0314 05:49:09.301999 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:09 crc kubenswrapper[4713]: E0314 05:49:09.302148 4713 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 05:49:09 crc kubenswrapper[4713]: I0314 05:49:09.302414 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:09 crc kubenswrapper[4713]: E0314 05:49:09.302466 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:17.302447689 +0000 UTC m=+1340.390356989 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "metrics-server-cert" not found Mar 14 05:49:09 crc kubenswrapper[4713]: E0314 05:49:09.302568 4713 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 05:49:09 crc kubenswrapper[4713]: E0314 05:49:09.302626 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:17.302611735 +0000 UTC m=+1340.390521035 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "webhook-server-cert" not found Mar 14 05:49:16 crc kubenswrapper[4713]: I0314 05:49:16.529783 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:16 crc kubenswrapper[4713]: E0314 05:49:16.530139 4713 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:16 crc kubenswrapper[4713]: E0314 05:49:16.530575 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert 
podName:cbc588fa-b052-4336-81fe-2fed809e251b nodeName:}" failed. No retries permitted until 2026-03-14 05:49:32.530551111 +0000 UTC m=+1355.618460411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert") pod "infra-operator-controller-manager-54dc5b8f8d-fm6fr" (UID: "cbc588fa-b052-4336-81fe-2fed809e251b") : secret "infra-operator-webhook-server-cert" not found Mar 14 05:49:16 crc kubenswrapper[4713]: I0314 05:49:16.938123 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:16 crc kubenswrapper[4713]: E0314 05:49:16.938320 4713 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:16 crc kubenswrapper[4713]: E0314 05:49:16.938473 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert podName:4da1ed21-82a5-400c-a201-653fe58adf4c nodeName:}" failed. No retries permitted until 2026-03-14 05:49:32.938452993 +0000 UTC m=+1356.026362293 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" (UID: "4da1ed21-82a5-400c-a201-653fe58adf4c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 05:49:17 crc kubenswrapper[4713]: I0314 05:49:17.347674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:17 crc kubenswrapper[4713]: I0314 05:49:17.347762 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:17 crc kubenswrapper[4713]: E0314 05:49:17.347849 4713 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 05:49:17 crc kubenswrapper[4713]: E0314 05:49:17.347929 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:33.347909462 +0000 UTC m=+1356.435818762 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "metrics-server-cert" not found Mar 14 05:49:17 crc kubenswrapper[4713]: E0314 05:49:17.347930 4713 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 05:49:17 crc kubenswrapper[4713]: E0314 05:49:17.347993 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs podName:92164fd9-b08c-4b00-975c-0fcdd245f8f9 nodeName:}" failed. No retries permitted until 2026-03-14 05:49:33.347973094 +0000 UTC m=+1356.435882474 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs") pod "openstack-operator-controller-manager-85d9999fbb-kdnkw" (UID: "92164fd9-b08c-4b00-975c-0fcdd245f8f9") : secret "webhook-server-cert" not found Mar 14 05:49:17 crc kubenswrapper[4713]: E0314 05:49:17.666052 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 14 05:49:17 crc kubenswrapper[4713]: E0314 05:49:17.666250 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7gs6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-shvlw_openstack-operators(46274028-feea-4c48-b086-44533fc3e996): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:17 crc kubenswrapper[4713]: E0314 05:49:17.667467 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" podUID="46274028-feea-4c48-b086-44533fc3e996" Mar 14 05:49:18 crc kubenswrapper[4713]: E0314 05:49:18.255846 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807" Mar 14 05:49:18 crc kubenswrapper[4713]: E0314 05:49:18.256349 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-59b46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-pqnf9_openstack-operators(3941b4bd-470d-4351-aed9-4bc1f90f9ad4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:18 crc kubenswrapper[4713]: E0314 05:49:18.257528 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" podUID="3941b4bd-470d-4351-aed9-4bc1f90f9ad4" Mar 14 05:49:18 crc kubenswrapper[4713]: E0314 05:49:18.583281 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" podUID="3941b4bd-470d-4351-aed9-4bc1f90f9ad4" Mar 14 05:49:18 crc kubenswrapper[4713]: E0314 05:49:18.583307 4713 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" podUID="46274028-feea-4c48-b086-44533fc3e996" Mar 14 05:49:18 crc kubenswrapper[4713]: E0314 05:49:18.826859 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165" Mar 14 05:49:18 crc kubenswrapper[4713]: E0314 05:49:18.827059 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-62cpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-d47688694-kq9dl_openstack-operators(12eb62d0-8721-4482-b4a3-148a61cea029): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:18 crc kubenswrapper[4713]: E0314 05:49:18.828281 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" podUID="12eb62d0-8721-4482-b4a3-148a61cea029" Mar 14 05:49:19 crc kubenswrapper[4713]: E0314 05:49:19.591571 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" podUID="12eb62d0-8721-4482-b4a3-148a61cea029" Mar 14 05:49:19 crc kubenswrapper[4713]: E0314 05:49:19.694331 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6" Mar 14 05:49:19 crc kubenswrapper[4713]: E0314 05:49:19.694469 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jj6z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6d9d6b584d-j587n_openstack-operators(6a712708-53c3-4854-9a45-3442ee780cdc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:19 crc kubenswrapper[4713]: E0314 05:49:19.695776 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" podUID="6a712708-53c3-4854-9a45-3442ee780cdc" Mar 14 05:49:20 crc kubenswrapper[4713]: E0314 05:49:20.596682 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" podUID="6a712708-53c3-4854-9a45-3442ee780cdc" Mar 14 05:49:22 crc kubenswrapper[4713]: E0314 05:49:22.416239 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7" Mar 14 05:49:22 crc kubenswrapper[4713]: E0314 05:49:22.416675 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-whf7z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66d56f6ff4-j4474_openstack-operators(fa62dff3-1643-4e94-b31a-d56b21a2327d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:22 crc kubenswrapper[4713]: E0314 05:49:22.418635 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" podUID="fa62dff3-1643-4e94-b31a-d56b21a2327d" Mar 14 05:49:22 crc kubenswrapper[4713]: E0314 05:49:22.625944 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" podUID="fa62dff3-1643-4e94-b31a-d56b21a2327d" Mar 14 05:49:22 crc kubenswrapper[4713]: E0314 05:49:22.828768 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40" Mar 14 05:49:22 crc kubenswrapper[4713]: E0314 05:49:22.828986 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wgrj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-57b484b4df-gbrmh_openstack-operators(ccd30d62-2e42-4399-b1eb-dfde3782dcb8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:22 crc kubenswrapper[4713]: E0314 05:49:22.830165 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" podUID="ccd30d62-2e42-4399-b1eb-dfde3782dcb8" Mar 14 05:49:23 crc kubenswrapper[4713]: E0314 05:49:23.632714 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40\\\"\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" podUID="ccd30d62-2e42-4399-b1eb-dfde3782dcb8" Mar 14 05:49:25 crc kubenswrapper[4713]: E0314 05:49:25.643595 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:af6065309235d342f15ba68d4bec51117e3a21cc630b5b72ba04aca2ce0d3703" Mar 14 05:49:25 crc kubenswrapper[4713]: E0314 05:49:25.644156 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:af6065309235d342f15ba68d4bec51117e3a21cc630b5b72ba04aca2ce0d3703,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sw54d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5bc894d9b-6xshs_openstack-operators(9da309ad-34cc-4b06-b166-c571b5a39825): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:25 crc kubenswrapper[4713]: E0314 05:49:25.645750 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" podUID="9da309ad-34cc-4b06-b166-c571b5a39825" Mar 14 05:49:26 crc kubenswrapper[4713]: E0314 05:49:26.079457 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 14 05:49:26 crc kubenswrapper[4713]: E0314 05:49:26.079787 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lgw4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-7t4g8_openstack-operators(d5c6be47-5c06-46e0-ae8c-87b7a3f23561): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:26 crc kubenswrapper[4713]: E0314 05:49:26.080999 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" podUID="d5c6be47-5c06-46e0-ae8c-87b7a3f23561" Mar 14 05:49:26 crc kubenswrapper[4713]: E0314 05:49:26.545168 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6" Mar 14 05:49:26 crc kubenswrapper[4713]: E0314 05:49:26.545353 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7hj48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-77b6666d85-xvmqf_openstack-operators(ca4c3a10-3f60-460b-ad21-258a757bf57c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:26 crc kubenswrapper[4713]: E0314 05:49:26.546464 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" podUID="ca4c3a10-3f60-460b-ad21-258a757bf57c" Mar 14 05:49:26 crc kubenswrapper[4713]: E0314 05:49:26.658579 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6\\\"\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" podUID="ca4c3a10-3f60-460b-ad21-258a757bf57c" Mar 14 05:49:26 crc kubenswrapper[4713]: E0314 05:49:26.658588 4713 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" podUID="d5c6be47-5c06-46e0-ae8c-87b7a3f23561" Mar 14 05:49:26 crc kubenswrapper[4713]: E0314 05:49:26.658725 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:af6065309235d342f15ba68d4bec51117e3a21cc630b5b72ba04aca2ce0d3703\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" podUID="9da309ad-34cc-4b06-b166-c571b5a39825" Mar 14 05:49:28 crc kubenswrapper[4713]: E0314 05:49:28.525571 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f" Mar 14 05:49:28 crc kubenswrapper[4713]: E0314 05:49:28.525795 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 
-3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ztfvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-kv59d_openstack-operators(a55d0754-702d-4dbc-995a-b98d852678ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:28 crc kubenswrapper[4713]: E0314 05:49:28.526982 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" podUID="a55d0754-702d-4dbc-995a-b98d852678ce" Mar 14 05:49:28 crc kubenswrapper[4713]: E0314 05:49:28.676093 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" podUID="a55d0754-702d-4dbc-995a-b98d852678ce" Mar 14 05:49:29 crc kubenswrapper[4713]: E0314 05:49:29.481454 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5" Mar 14 05:49:29 crc kubenswrapper[4713]: E0314 05:49:29.481730 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2qzhv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7f9cc5dd44-2r2fz_openstack-operators(383e8493-0661-4b45-a72c-5851b520c65b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:29 crc kubenswrapper[4713]: E0314 05:49:29.483446 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" podUID="383e8493-0661-4b45-a72c-5851b520c65b" Mar 14 05:49:29 crc kubenswrapper[4713]: E0314 05:49:29.684157 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" podUID="383e8493-0661-4b45-a72c-5851b520c65b" Mar 14 05:49:31 crc kubenswrapper[4713]: E0314 05:49:31.146934 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff" Mar 14 05:49:31 crc kubenswrapper[4713]: E0314 05:49:31.147426 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f5tg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7f84474648-8mm88_openstack-operators(409a2a8b-7e66-4763-9698-3a909f051c50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:31 crc kubenswrapper[4713]: E0314 05:49:31.148647 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" podUID="409a2a8b-7e66-4763-9698-3a909f051c50" Mar 14 05:49:31 crc kubenswrapper[4713]: E0314 05:49:31.709825 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" podUID="409a2a8b-7e66-4763-9698-3a909f051c50" Mar 14 05:49:32 crc kubenswrapper[4713]: E0314 05:49:32.499133 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 14 05:49:32 crc kubenswrapper[4713]: E0314 05:49:32.499359 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8fddr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mlh55_openstack-operators(9be88166-e4c0-464c-9dc0-a8a51595c555): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:49:32 crc kubenswrapper[4713]: E0314 05:49:32.500737 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" podUID="9be88166-e4c0-464c-9dc0-a8a51595c555" Mar 14 05:49:32 crc kubenswrapper[4713]: I0314 05:49:32.627814 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:32 crc kubenswrapper[4713]: I0314 
05:49:32.637543 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbc588fa-b052-4336-81fe-2fed809e251b-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-fm6fr\" (UID: \"cbc588fa-b052-4336-81fe-2fed809e251b\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:32 crc kubenswrapper[4713]: I0314 05:49:32.641153 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:32 crc kubenswrapper[4713]: E0314 05:49:32.753679 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" podUID="9be88166-e4c0-464c-9dc0-a8a51595c555" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.033876 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.045931 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4da1ed21-82a5-400c-a201-653fe58adf4c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2\" (UID: \"4da1ed21-82a5-400c-a201-653fe58adf4c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 
05:49:33.126570 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.302088 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr"] Mar 14 05:49:33 crc kubenswrapper[4713]: W0314 05:49:33.339666 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc588fa_b052_4336_81fe_2fed809e251b.slice/crio-5be2d4cb83c5b49a8a189359a99df55b567bddf3120f480bb627c3af57c24826 WatchSource:0}: Error finding container 5be2d4cb83c5b49a8a189359a99df55b567bddf3120f480bb627c3af57c24826: Status 404 returned error can't find the container with id 5be2d4cb83c5b49a8a189359a99df55b567bddf3120f480bb627c3af57c24826 Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.450621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.450714 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.461618 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-metrics-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.467195 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92164fd9-b08c-4b00-975c-0fcdd245f8f9-webhook-certs\") pod \"openstack-operator-controller-manager-85d9999fbb-kdnkw\" (UID: \"92164fd9-b08c-4b00-975c-0fcdd245f8f9\") " pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.720223 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.727295 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" event={"ID":"4101fac4-706c-4e2b-9203-102d0874c3ba","Type":"ContainerStarted","Data":"e422772cf2e9faf3efb8e8c3bb328df78c305adc27a33e06aac40f0ad0f4f7ed"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.727553 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.729310 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" event={"ID":"12eb62d0-8721-4482-b4a3-148a61cea029","Type":"ContainerStarted","Data":"34062959514e02eeaa508da64ce7019c66f4d6c19e70b9d81878a18d7c1b1515"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.729811 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.730729 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" event={"ID":"cbc588fa-b052-4336-81fe-2fed809e251b","Type":"ContainerStarted","Data":"5be2d4cb83c5b49a8a189359a99df55b567bddf3120f480bb627c3af57c24826"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.733607 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" event={"ID":"149dd450-69f3-4d71-aac3-90052dcf2253","Type":"ContainerStarted","Data":"3c1917aa72816063692ddb403b69dbe2abec98c89c07f2fdc969fe50177e1bb8"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.733877 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.735687 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" event={"ID":"46274028-feea-4c48-b086-44533fc3e996","Type":"ContainerStarted","Data":"efe26cf1d0335ba614e31cc5af33596eed5c917d9227c5db20aa99df61fcd168"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.735949 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.741670 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" event={"ID":"3941b4bd-470d-4351-aed9-4bc1f90f9ad4","Type":"ContainerStarted","Data":"f82c8d7c2032dfcc9117a599f044a9b83e2a17662b5a28e6a01dfca3395a42ce"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.742078 4713 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.745771 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" event={"ID":"bae008e7-4329-4d30-9820-81daf4300f96","Type":"ContainerStarted","Data":"ac55e7f036519c5da0f510095aa5492250d0f8974924f597a8f4fec2549838b9"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.745992 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.749902 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" event={"ID":"fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd","Type":"ContainerStarted","Data":"4af2ba110bc9739af533c00d8381c24a89d9b893c5960d5668ccc0b7b49f497f"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.749928 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" podStartSLOduration=5.92623585 podStartE2EDuration="33.749911093s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.942852846 +0000 UTC m=+1328.030762146" lastFinishedPulling="2026-03-14 05:49:32.766528089 +0000 UTC m=+1355.854437389" observedRunningTime="2026-03-14 05:49:33.746420262 +0000 UTC m=+1356.834329572" watchObservedRunningTime="2026-03-14 05:49:33.749911093 +0000 UTC m=+1356.837820393" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.750264 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.752390 4713 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" event={"ID":"4128f2c6-d929-4815-8502-291baf22f24f","Type":"ContainerStarted","Data":"a6dc1cf166f69ad63a84628a658f04db510d2446dec9a060cb0db091436e7b19"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.752521 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.755753 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" event={"ID":"aa4ff369-f2af-439f-b9f6-2c8301e80210","Type":"ContainerStarted","Data":"13b400969fefb1f020b9e9dc6935b3981373fa25f7684a77f87fdc5e549a3a15"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.755942 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.760710 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" event={"ID":"50d43641-0638-4763-9123-0c0c2c76629e","Type":"ContainerStarted","Data":"036b2d566a8310ee8d8804b85b346c6f4e06a43a740b2d1c84f6500fe7ab4b80"} Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.761019 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.783683 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" podStartSLOduration=4.189398622 podStartE2EDuration="33.783657439s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:03.506932639 +0000 UTC m=+1326.594841929" 
lastFinishedPulling="2026-03-14 05:49:33.101191436 +0000 UTC m=+1356.189100746" observedRunningTime="2026-03-14 05:49:33.779513326 +0000 UTC m=+1356.867422626" watchObservedRunningTime="2026-03-14 05:49:33.783657439 +0000 UTC m=+1356.871566739" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.843455 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2"] Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.851473 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" podStartSLOduration=5.767048786 podStartE2EDuration="33.851448409s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.710986406 +0000 UTC m=+1327.798895706" lastFinishedPulling="2026-03-14 05:49:32.795386029 +0000 UTC m=+1355.883295329" observedRunningTime="2026-03-14 05:49:33.837110581 +0000 UTC m=+1356.925019901" watchObservedRunningTime="2026-03-14 05:49:33.851448409 +0000 UTC m=+1356.939357709" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.899784 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" podStartSLOduration=4.221781475 podStartE2EDuration="33.899762119s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:02.895532053 +0000 UTC m=+1325.983441353" lastFinishedPulling="2026-03-14 05:49:32.573512687 +0000 UTC m=+1355.661421997" observedRunningTime="2026-03-14 05:49:33.866926602 +0000 UTC m=+1356.954835912" watchObservedRunningTime="2026-03-14 05:49:33.899762119 +0000 UTC m=+1356.987671419" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.916926 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" 
podStartSLOduration=6.901736512 podStartE2EDuration="33.916907595s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.804367483 +0000 UTC m=+1327.892276783" lastFinishedPulling="2026-03-14 05:49:31.819538566 +0000 UTC m=+1354.907447866" observedRunningTime="2026-03-14 05:49:33.902096203 +0000 UTC m=+1356.990005503" watchObservedRunningTime="2026-03-14 05:49:33.916907595 +0000 UTC m=+1357.004816895" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.936071 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" podStartSLOduration=4.080882172 podStartE2EDuration="33.936047535s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:02.913391661 +0000 UTC m=+1326.001300961" lastFinishedPulling="2026-03-14 05:49:32.768557024 +0000 UTC m=+1355.856466324" observedRunningTime="2026-03-14 05:49:33.925748707 +0000 UTC m=+1357.013658007" watchObservedRunningTime="2026-03-14 05:49:33.936047535 +0000 UTC m=+1357.023956835" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.956136 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" podStartSLOduration=5.152457438 podStartE2EDuration="33.956113715s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:02.263079435 +0000 UTC m=+1325.350988735" lastFinishedPulling="2026-03-14 05:49:31.066735712 +0000 UTC m=+1354.154645012" observedRunningTime="2026-03-14 05:49:33.954479633 +0000 UTC m=+1357.042388933" watchObservedRunningTime="2026-03-14 05:49:33.956113715 +0000 UTC m=+1357.044023015" Mar 14 05:49:33 crc kubenswrapper[4713]: I0314 05:49:33.984628 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" 
podStartSLOduration=6.352460785 podStartE2EDuration="33.984607323s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.942928499 +0000 UTC m=+1328.030837799" lastFinishedPulling="2026-03-14 05:49:32.575075037 +0000 UTC m=+1355.662984337" observedRunningTime="2026-03-14 05:49:33.977957171 +0000 UTC m=+1357.065866471" watchObservedRunningTime="2026-03-14 05:49:33.984607323 +0000 UTC m=+1357.072516623" Mar 14 05:49:34 crc kubenswrapper[4713]: I0314 05:49:34.040524 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" podStartSLOduration=6.255297688 podStartE2EDuration="34.040504285s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.682853579 +0000 UTC m=+1327.770762879" lastFinishedPulling="2026-03-14 05:49:32.468060176 +0000 UTC m=+1355.555969476" observedRunningTime="2026-03-14 05:49:34.015173647 +0000 UTC m=+1357.103082947" watchObservedRunningTime="2026-03-14 05:49:34.040504285 +0000 UTC m=+1357.128413575" Mar 14 05:49:34 crc kubenswrapper[4713]: I0314 05:49:34.041355 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" podStartSLOduration=6.239494834 podStartE2EDuration="34.041348611s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.761681722 +0000 UTC m=+1327.849591022" lastFinishedPulling="2026-03-14 05:49:32.563535499 +0000 UTC m=+1355.651444799" observedRunningTime="2026-03-14 05:49:34.035718702 +0000 UTC m=+1357.123628062" watchObservedRunningTime="2026-03-14 05:49:34.041348611 +0000 UTC m=+1357.129257901" Mar 14 05:49:35 crc kubenswrapper[4713]: I0314 05:49:35.781735 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" 
event={"ID":"4da1ed21-82a5-400c-a201-653fe58adf4c","Type":"ContainerStarted","Data":"ae3d6672666745e0c37b5141ebec8982c349e4b3076c75438e8271fb373b6075"} Mar 14 05:49:36 crc kubenswrapper[4713]: W0314 05:49:36.114798 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92164fd9_b08c_4b00_975c_0fcdd245f8f9.slice/crio-9417e8bb013801159c741775aef73a7beef0286ee834a4bb986b5931b7e46d8b WatchSource:0}: Error finding container 9417e8bb013801159c741775aef73a7beef0286ee834a4bb986b5931b7e46d8b: Status 404 returned error can't find the container with id 9417e8bb013801159c741775aef73a7beef0286ee834a4bb986b5931b7e46d8b Mar 14 05:49:36 crc kubenswrapper[4713]: I0314 05:49:36.117985 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw"] Mar 14 05:49:36 crc kubenswrapper[4713]: I0314 05:49:36.793042 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" event={"ID":"92164fd9-b08c-4b00-975c-0fcdd245f8f9","Type":"ContainerStarted","Data":"b9bb85ceee9e6547d427be966df978024c1d8d425572dcb0dcf0573c050a4297"} Mar 14 05:49:36 crc kubenswrapper[4713]: I0314 05:49:36.793500 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" event={"ID":"92164fd9-b08c-4b00-975c-0fcdd245f8f9","Type":"ContainerStarted","Data":"9417e8bb013801159c741775aef73a7beef0286ee834a4bb986b5931b7e46d8b"} Mar 14 05:49:36 crc kubenswrapper[4713]: I0314 05:49:36.793516 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" Mar 14 05:49:36 crc kubenswrapper[4713]: I0314 05:49:36.824054 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" podStartSLOduration=35.824034114 podStartE2EDuration="35.824034114s" podCreationTimestamp="2026-03-14 05:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:49:36.819002983 +0000 UTC m=+1359.906912293" watchObservedRunningTime="2026-03-14 05:49:36.824034114 +0000 UTC m=+1359.911943414" Mar 14 05:49:37 crc kubenswrapper[4713]: I0314 05:49:37.801452 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" event={"ID":"6a712708-53c3-4854-9a45-3442ee780cdc","Type":"ContainerStarted","Data":"f3e8a58cf503e58568bcaafdea56f7bb258f4101d6fd3825963e51080e22897b"} Mar 14 05:49:37 crc kubenswrapper[4713]: I0314 05:49:37.802932 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" Mar 14 05:49:37 crc kubenswrapper[4713]: I0314 05:49:37.805133 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" event={"ID":"ccd30d62-2e42-4399-b1eb-dfde3782dcb8","Type":"ContainerStarted","Data":"b2d3a53236ef04cdd1aaf129f3d30e3d4b30d8280b2636200cc135de1ad92801"} Mar 14 05:49:37 crc kubenswrapper[4713]: I0314 05:49:37.805346 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" Mar 14 05:49:37 crc kubenswrapper[4713]: I0314 05:49:37.806932 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" event={"ID":"fa62dff3-1643-4e94-b31a-d56b21a2327d","Type":"ContainerStarted","Data":"4a5d121b86a6cebe32334893ac58591869b1895e35ac0431262167762c88ce25"} Mar 14 05:49:37 crc kubenswrapper[4713]: I0314 
05:49:37.807500 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" Mar 14 05:49:37 crc kubenswrapper[4713]: I0314 05:49:37.824846 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n" podStartSLOduration=4.173938579 podStartE2EDuration="37.824832042s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:03.472612296 +0000 UTC m=+1326.560521596" lastFinishedPulling="2026-03-14 05:49:37.123505769 +0000 UTC m=+1360.211415059" observedRunningTime="2026-03-14 05:49:37.824228473 +0000 UTC m=+1360.912137773" watchObservedRunningTime="2026-03-14 05:49:37.824832042 +0000 UTC m=+1360.912741342" Mar 14 05:49:37 crc kubenswrapper[4713]: I0314 05:49:37.849515 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh" podStartSLOduration=5.847771949 podStartE2EDuration="37.849495528s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.940785521 +0000 UTC m=+1328.028694821" lastFinishedPulling="2026-03-14 05:49:36.9425091 +0000 UTC m=+1360.030418400" observedRunningTime="2026-03-14 05:49:37.842398492 +0000 UTC m=+1360.930307792" watchObservedRunningTime="2026-03-14 05:49:37.849495528 +0000 UTC m=+1360.937404818" Mar 14 05:49:37 crc kubenswrapper[4713]: I0314 05:49:37.862470 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" podStartSLOduration=3.775729368 podStartE2EDuration="37.862451792s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:02.856040864 +0000 UTC m=+1325.943950164" lastFinishedPulling="2026-03-14 05:49:36.942763288 +0000 UTC m=+1360.030672588" 
observedRunningTime="2026-03-14 05:49:37.85770182 +0000 UTC m=+1360.945611120" watchObservedRunningTime="2026-03-14 05:49:37.862451792 +0000 UTC m=+1360.950361092" Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.615754 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.616331 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.708583 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.843857 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" event={"ID":"cbc588fa-b052-4336-81fe-2fed809e251b","Type":"ContainerStarted","Data":"43849f1262190383ff5c0b9e4d01b17e08471ceda91baa9b676573c1d5ceea8d"} Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.844258 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.851376 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" event={"ID":"4da1ed21-82a5-400c-a201-653fe58adf4c","Type":"ContainerStarted","Data":"4afb028a2a658214367add9cd913baceeb2616568cfa00abdffa01cfa85e321c"} Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.851549 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 
05:49:40.853322 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" event={"ID":"d5c6be47-5c06-46e0-ae8c-87b7a3f23561","Type":"ContainerStarted","Data":"004334f017621f5ff76750eb248ef658c45fb43e42f2515ec1ffc6f9561b94df"} Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.853917 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.858862 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" event={"ID":"9da309ad-34cc-4b06-b166-c571b5a39825","Type":"ContainerStarted","Data":"213b3f4dc8a4d4a8cbfbd06a4bc4b9fe1d569ae71017df9e091ff1aa180c6f0b"} Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.860071 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.866049 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" event={"ID":"ca4c3a10-3f60-460b-ad21-258a757bf57c","Type":"ContainerStarted","Data":"1bbe7dd138abd63c348187f086c322ca587854a8511578d1eb2051f871ac0396"} Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.867002 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.872187 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" podStartSLOduration=34.100513768 podStartE2EDuration="40.87216068s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 
05:49:33.357117283 +0000 UTC m=+1356.445026593" lastFinishedPulling="2026-03-14 05:49:40.128764195 +0000 UTC m=+1363.216673505" observedRunningTime="2026-03-14 05:49:40.862891594 +0000 UTC m=+1363.950800914" watchObservedRunningTime="2026-03-14 05:49:40.87216068 +0000 UTC m=+1363.960069980"
Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.893981 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" podStartSLOduration=5.545747133 podStartE2EDuration="40.893963715s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.781746422 +0000 UTC m=+1327.869655712" lastFinishedPulling="2026-03-14 05:49:40.129962994 +0000 UTC m=+1363.217872294" observedRunningTime="2026-03-14 05:49:40.881451635 +0000 UTC m=+1363.969360945" watchObservedRunningTime="2026-03-14 05:49:40.893963715 +0000 UTC m=+1363.981873015"
Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.901912 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" podStartSLOduration=5.575975405 podStartE2EDuration="40.901896547s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.803388681 +0000 UTC m=+1327.891297991" lastFinishedPulling="2026-03-14 05:49:40.129309833 +0000 UTC m=+1363.217219133" observedRunningTime="2026-03-14 05:49:40.895614297 +0000 UTC m=+1363.983523607" watchObservedRunningTime="2026-03-14 05:49:40.901896547 +0000 UTC m=+1363.989805847"
Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.932712 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" podStartSLOduration=36.466675194 podStartE2EDuration="40.932694539s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:35.663335049 +0000 UTC m=+1358.751244339" lastFinishedPulling="2026-03-14 05:49:40.129354384 +0000 UTC m=+1363.217263684" observedRunningTime="2026-03-14 05:49:40.932258895 +0000 UTC m=+1364.020168185" watchObservedRunningTime="2026-03-14 05:49:40.932694539 +0000 UTC m=+1364.020603839"
Mar 14 05:49:40 crc kubenswrapper[4713]: I0314 05:49:40.954323 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf" podStartSLOduration=3.697986578 podStartE2EDuration="40.954304647s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:02.875436432 +0000 UTC m=+1325.963345732" lastFinishedPulling="2026-03-14 05:49:40.131754501 +0000 UTC m=+1363.219663801" observedRunningTime="2026-03-14 05:49:40.951226609 +0000 UTC m=+1364.039135929" watchObservedRunningTime="2026-03-14 05:49:40.954304647 +0000 UTC m=+1364.042213947"
Mar 14 05:49:41 crc kubenswrapper[4713]: I0314 05:49:41.160706 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw"
Mar 14 05:49:41 crc kubenswrapper[4713]: I0314 05:49:41.212501 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk"
Mar 14 05:49:41 crc kubenswrapper[4713]: I0314 05:49:41.334722 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5"
Mar 14 05:49:41 crc kubenswrapper[4713]: I0314 05:49:41.587523 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2"
Mar 14 05:49:42 crc kubenswrapper[4713]: I0314 05:49:42.056095 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87"
Mar 14 05:49:42 crc kubenswrapper[4713]: I0314 05:49:42.116138 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h"
Mar 14 05:49:42 crc kubenswrapper[4713]: I0314 05:49:42.206462 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9"
Mar 14 05:49:42 crc kubenswrapper[4713]: I0314 05:49:42.884616 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" event={"ID":"a55d0754-702d-4dbc-995a-b98d852678ce","Type":"ContainerStarted","Data":"6dcf3852a97d3aaf4642f752e2400f51a6559ecd8b7862d5f1e2440663c44ffd"}
Mar 14 05:49:42 crc kubenswrapper[4713]: I0314 05:49:42.885194 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d"
Mar 14 05:49:42 crc kubenswrapper[4713]: I0314 05:49:42.908851 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" podStartSLOduration=5.356063747 podStartE2EDuration="42.908832324s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.769980837 +0000 UTC m=+1327.857890127" lastFinishedPulling="2026-03-14 05:49:42.322749404 +0000 UTC m=+1365.410658704" observedRunningTime="2026-03-14 05:49:42.90145463 +0000 UTC m=+1365.989363940" watchObservedRunningTime="2026-03-14 05:49:42.908832324 +0000 UTC m=+1365.996741624"
Mar 14 05:49:43 crc kubenswrapper[4713]: I0314 05:49:43.730626 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw"
Mar 14 05:49:43 crc kubenswrapper[4713]: I0314 05:49:43.894593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" event={"ID":"383e8493-0661-4b45-a72c-5851b520c65b","Type":"ContainerStarted","Data":"875a0e0781b481ec425f56e42692e41e41416c6d80be10a5b5c31c49a83a87b9"}
Mar 14 05:49:43 crc kubenswrapper[4713]: I0314 05:49:43.895053 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz"
Mar 14 05:49:43 crc kubenswrapper[4713]: I0314 05:49:43.896499 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" event={"ID":"409a2a8b-7e66-4763-9698-3a909f051c50","Type":"ContainerStarted","Data":"c297a3d67f12c0b9c06fc0019b121e5d526fb655e3d1f9b3bf1da03498020cd4"}
Mar 14 05:49:43 crc kubenswrapper[4713]: I0314 05:49:43.896834 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88"
Mar 14 05:49:43 crc kubenswrapper[4713]: I0314 05:49:43.927570 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" podStartSLOduration=5.688637707 podStartE2EDuration="43.927548024s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.765091521 +0000 UTC m=+1327.853000821" lastFinishedPulling="2026-03-14 05:49:43.004001838 +0000 UTC m=+1366.091911138" observedRunningTime="2026-03-14 05:49:43.918396402 +0000 UTC m=+1367.006305702" watchObservedRunningTime="2026-03-14 05:49:43.927548024 +0000 UTC m=+1367.015457324"
Mar 14 05:49:43 crc kubenswrapper[4713]: I0314 05:49:43.941544 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" podStartSLOduration=5.653574258 podStartE2EDuration="43.941523099s" podCreationTimestamp="2026-03-14 05:49:00 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.797588876 +0000 UTC m=+1327.885498176" lastFinishedPulling="2026-03-14 05:49:43.085537717 +0000 UTC m=+1366.173447017" observedRunningTime="2026-03-14 05:49:43.935470396 +0000 UTC m=+1367.023379696" watchObservedRunningTime="2026-03-14 05:49:43.941523099 +0000 UTC m=+1367.029432399"
Mar 14 05:49:48 crc kubenswrapper[4713]: I0314 05:49:48.951868 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" event={"ID":"9be88166-e4c0-464c-9dc0-a8a51595c555","Type":"ContainerStarted","Data":"c97da70cec37c00d2f85759f57ae7b6ca52d168088de4ac5e6ad080a05aa8844"}
Mar 14 05:49:48 crc kubenswrapper[4713]: I0314 05:49:48.978149 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlh55" podStartSLOduration=4.96358457 podStartE2EDuration="47.97812712s" podCreationTimestamp="2026-03-14 05:49:01 +0000 UTC" firstStartedPulling="2026-03-14 05:49:04.940390718 +0000 UTC m=+1328.028300018" lastFinishedPulling="2026-03-14 05:49:47.954933268 +0000 UTC m=+1371.042842568" observedRunningTime="2026-03-14 05:49:48.970690234 +0000 UTC m=+1372.058599544" watchObservedRunningTime="2026-03-14 05:49:48.97812712 +0000 UTC m=+1372.066036440"
Mar 14 05:49:50 crc kubenswrapper[4713]: I0314 05:49:50.645629 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474"
Mar 14 05:49:50 crc kubenswrapper[4713]: I0314 05:49:50.745310 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xvmqf"
Mar 14 05:49:50 crc kubenswrapper[4713]: I0314 05:49:50.773852 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-j587n"
Mar 14 05:49:51 crc kubenswrapper[4713]: I0314 05:49:51.132532 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs"
Mar 14 05:49:51 crc kubenswrapper[4713]: I0314 05:49:51.186840 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-gbrmh"
Mar 14 05:49:51 crc kubenswrapper[4713]: I0314 05:49:51.286827 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88"
Mar 14 05:49:51 crc kubenswrapper[4713]: I0314 05:49:51.315994 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8"
Mar 14 05:49:51 crc kubenswrapper[4713]: I0314 05:49:51.591197 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d"
Mar 14 05:49:51 crc kubenswrapper[4713]: I0314 05:49:51.898516 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz"
Mar 14 05:49:52 crc kubenswrapper[4713]: I0314 05:49:52.647529 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr"
Mar 14 05:49:53 crc kubenswrapper[4713]: I0314 05:49:53.133750 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2"
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.155424 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557790-mlscf"]
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.157362 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557790-mlscf"
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.162620 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.163012 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.163375 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.167156 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557790-mlscf"]
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.312027 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2vtp\" (UniqueName: \"kubernetes.io/projected/ac3b651c-ff92-4369-9bec-522a5c7c9aba-kube-api-access-z2vtp\") pod \"auto-csr-approver-29557790-mlscf\" (UID: \"ac3b651c-ff92-4369-9bec-522a5c7c9aba\") " pod="openshift-infra/auto-csr-approver-29557790-mlscf"
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.414285 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2vtp\" (UniqueName: \"kubernetes.io/projected/ac3b651c-ff92-4369-9bec-522a5c7c9aba-kube-api-access-z2vtp\") pod \"auto-csr-approver-29557790-mlscf\" (UID: \"ac3b651c-ff92-4369-9bec-522a5c7c9aba\") " pod="openshift-infra/auto-csr-approver-29557790-mlscf"
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.432549 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2vtp\" (UniqueName: \"kubernetes.io/projected/ac3b651c-ff92-4369-9bec-522a5c7c9aba-kube-api-access-z2vtp\") pod \"auto-csr-approver-29557790-mlscf\" (UID: \"ac3b651c-ff92-4369-9bec-522a5c7c9aba\") " pod="openshift-infra/auto-csr-approver-29557790-mlscf"
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.477589 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557790-mlscf"
Mar 14 05:50:00 crc kubenswrapper[4713]: I0314 05:50:00.917195 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557790-mlscf"]
Mar 14 05:50:00 crc kubenswrapper[4713]: W0314 05:50:00.936303 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac3b651c_ff92_4369_9bec_522a5c7c9aba.slice/crio-3d9e9990ef23acf52affe06018a03922efc244184a848518607eda34ee8f18f0 WatchSource:0}: Error finding container 3d9e9990ef23acf52affe06018a03922efc244184a848518607eda34ee8f18f0: Status 404 returned error can't find the container with id 3d9e9990ef23acf52affe06018a03922efc244184a848518607eda34ee8f18f0
Mar 14 05:50:01 crc kubenswrapper[4713]: I0314 05:50:01.047501 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557790-mlscf" event={"ID":"ac3b651c-ff92-4369-9bec-522a5c7c9aba","Type":"ContainerStarted","Data":"3d9e9990ef23acf52affe06018a03922efc244184a848518607eda34ee8f18f0"}
Mar 14 05:50:03 crc kubenswrapper[4713]: I0314 05:50:03.067168 4713 generic.go:334] "Generic (PLEG): container finished" podID="ac3b651c-ff92-4369-9bec-522a5c7c9aba" containerID="29c3fe2cc97c8cd4221c05d4eece2a38d0a377297076d6d859c628e0f71b4bec" exitCode=0
Mar 14 05:50:03 crc kubenswrapper[4713]: I0314 05:50:03.067275 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557790-mlscf" event={"ID":"ac3b651c-ff92-4369-9bec-522a5c7c9aba","Type":"ContainerDied","Data":"29c3fe2cc97c8cd4221c05d4eece2a38d0a377297076d6d859c628e0f71b4bec"}
Mar 14 05:50:04 crc kubenswrapper[4713]: I0314 05:50:04.412986 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557790-mlscf"
Mar 14 05:50:04 crc kubenswrapper[4713]: I0314 05:50:04.486574 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2vtp\" (UniqueName: \"kubernetes.io/projected/ac3b651c-ff92-4369-9bec-522a5c7c9aba-kube-api-access-z2vtp\") pod \"ac3b651c-ff92-4369-9bec-522a5c7c9aba\" (UID: \"ac3b651c-ff92-4369-9bec-522a5c7c9aba\") "
Mar 14 05:50:04 crc kubenswrapper[4713]: I0314 05:50:04.492399 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3b651c-ff92-4369-9bec-522a5c7c9aba-kube-api-access-z2vtp" (OuterVolumeSpecName: "kube-api-access-z2vtp") pod "ac3b651c-ff92-4369-9bec-522a5c7c9aba" (UID: "ac3b651c-ff92-4369-9bec-522a5c7c9aba"). InnerVolumeSpecName "kube-api-access-z2vtp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:50:04 crc kubenswrapper[4713]: I0314 05:50:04.588829 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2vtp\" (UniqueName: \"kubernetes.io/projected/ac3b651c-ff92-4369-9bec-522a5c7c9aba-kube-api-access-z2vtp\") on node \"crc\" DevicePath \"\""
Mar 14 05:50:05 crc kubenswrapper[4713]: I0314 05:50:05.091677 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557790-mlscf" event={"ID":"ac3b651c-ff92-4369-9bec-522a5c7c9aba","Type":"ContainerDied","Data":"3d9e9990ef23acf52affe06018a03922efc244184a848518607eda34ee8f18f0"}
Mar 14 05:50:05 crc kubenswrapper[4713]: I0314 05:50:05.092005 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d9e9990ef23acf52affe06018a03922efc244184a848518607eda34ee8f18f0"
Mar 14 05:50:05 crc kubenswrapper[4713]: I0314 05:50:05.091752 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557790-mlscf"
Mar 14 05:50:05 crc kubenswrapper[4713]: I0314 05:50:05.486822 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557784-hq5bm"]
Mar 14 05:50:05 crc kubenswrapper[4713]: I0314 05:50:05.493841 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557784-hq5bm"]
Mar 14 05:50:05 crc kubenswrapper[4713]: I0314 05:50:05.575424 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f430a1b-bca2-4fe3-8f22-d83f1ed50e16" path="/var/lib/kubelet/pods/3f430a1b-bca2-4fe3-8f22-d83f1ed50e16/volumes"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.704137 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q8z8q"]
Mar 14 05:50:10 crc kubenswrapper[4713]: E0314 05:50:10.705469 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3b651c-ff92-4369-9bec-522a5c7c9aba" containerName="oc"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.705482 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3b651c-ff92-4369-9bec-522a5c7c9aba" containerName="oc"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.705658 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3b651c-ff92-4369-9bec-522a5c7c9aba" containerName="oc"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.706820 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.708597 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.711470 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8kldm"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.711722 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.715916 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.736163 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q8z8q"]
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.785644 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mr52c"]
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.787970 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.790121 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.806920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9fb1255-a854-45aa-8f7b-1cac97290ec4-config\") pod \"dnsmasq-dns-675f4bcbfc-q8z8q\" (UID: \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.807050 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wb5\" (UniqueName: \"kubernetes.io/projected/f9fb1255-a854-45aa-8f7b-1cac97290ec4-kube-api-access-j2wb5\") pod \"dnsmasq-dns-675f4bcbfc-q8z8q\" (UID: \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.808162 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mr52c"]
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.909815 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9fb1255-a854-45aa-8f7b-1cac97290ec4-config\") pod \"dnsmasq-dns-675f4bcbfc-q8z8q\" (UID: \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.909954 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-config\") pod \"dnsmasq-dns-78dd6ddcc-mr52c\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.909989 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wb5\" (UniqueName: \"kubernetes.io/projected/f9fb1255-a854-45aa-8f7b-1cac97290ec4-kube-api-access-j2wb5\") pod \"dnsmasq-dns-675f4bcbfc-q8z8q\" (UID: \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.910035 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mr52c\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.910104 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4gj5\" (UniqueName: \"kubernetes.io/projected/19474f89-6d04-494d-93d5-235f873cfcdb-kube-api-access-q4gj5\") pod \"dnsmasq-dns-78dd6ddcc-mr52c\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.910771 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9fb1255-a854-45aa-8f7b-1cac97290ec4-config\") pod \"dnsmasq-dns-675f4bcbfc-q8z8q\" (UID: \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q"
Mar 14 05:50:10 crc kubenswrapper[4713]: I0314 05:50:10.941345 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wb5\" (UniqueName: \"kubernetes.io/projected/f9fb1255-a854-45aa-8f7b-1cac97290ec4-kube-api-access-j2wb5\") pod \"dnsmasq-dns-675f4bcbfc-q8z8q\" (UID: \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.011511 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4gj5\" (UniqueName: \"kubernetes.io/projected/19474f89-6d04-494d-93d5-235f873cfcdb-kube-api-access-q4gj5\") pod \"dnsmasq-dns-78dd6ddcc-mr52c\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.011680 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-config\") pod \"dnsmasq-dns-78dd6ddcc-mr52c\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.011732 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mr52c\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.012799 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-config\") pod \"dnsmasq-dns-78dd6ddcc-mr52c\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.012815 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mr52c\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.031158 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4gj5\" (UniqueName: \"kubernetes.io/projected/19474f89-6d04-494d-93d5-235f873cfcdb-kube-api-access-q4gj5\") pod \"dnsmasq-dns-78dd6ddcc-mr52c\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.058643 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.100965 4713 scope.go:117] "RemoveContainer" containerID="241b1ad131aaa47ac2c35ca42a749cc53027b02890f5488c3934371c9fa7dbbd"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.109419 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c"
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.504335 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q8z8q"]
Mar 14 05:50:11 crc kubenswrapper[4713]: I0314 05:50:11.606873 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mr52c"]
Mar 14 05:50:12 crc kubenswrapper[4713]: I0314 05:50:12.171558 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q" event={"ID":"f9fb1255-a854-45aa-8f7b-1cac97290ec4","Type":"ContainerStarted","Data":"e39f7cadad07fe5e8e94994912a2f6d27f4e218d56b214e37171bf54c3d600c5"}
Mar 14 05:50:12 crc kubenswrapper[4713]: I0314 05:50:12.174543 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c" event={"ID":"19474f89-6d04-494d-93d5-235f873cfcdb","Type":"ContainerStarted","Data":"22f662f2e4262456a15c9487fda76d219bf80076a89f5112e04e6eaad0cdf077"}
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.726853 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q8z8q"]
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.747495 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-bq2gv"]
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.750735 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.785861 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-bq2gv"]
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.878400 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-724zc\" (UniqueName: \"kubernetes.io/projected/58e4241d-f7c9-4c22-9f7a-9391a358eda4-kube-api-access-724zc\") pod \"dnsmasq-dns-5ccc8479f9-bq2gv\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.878457 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-bq2gv\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.878606 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-config\") pod \"dnsmasq-dns-5ccc8479f9-bq2gv\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.981329 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-config\") pod \"dnsmasq-dns-5ccc8479f9-bq2gv\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.981418 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-724zc\" (UniqueName: \"kubernetes.io/projected/58e4241d-f7c9-4c22-9f7a-9391a358eda4-kube-api-access-724zc\") pod \"dnsmasq-dns-5ccc8479f9-bq2gv\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.981462 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-bq2gv\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.982250 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-bq2gv\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:13 crc kubenswrapper[4713]: I0314 05:50:13.982787 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-config\") pod \"dnsmasq-dns-5ccc8479f9-bq2gv\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.021887 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-724zc\" (UniqueName: \"kubernetes.io/projected/58e4241d-f7c9-4c22-9f7a-9391a358eda4-kube-api-access-724zc\") pod \"dnsmasq-dns-5ccc8479f9-bq2gv\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.060792 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mr52c"]
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.074644 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.095698 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jg97k"]
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.098568 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.127306 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jg97k"]
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.190135 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-config\") pod \"dnsmasq-dns-57d769cc4f-jg97k\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") " pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.190236 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnj9z\" (UniqueName: \"kubernetes.io/projected/4feb0fd5-d952-4323-821e-187c23e16463-kube-api-access-bnj9z\") pod \"dnsmasq-dns-57d769cc4f-jg97k\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") " pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.190297 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jg97k\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") " pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.291302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnj9z\" (UniqueName: \"kubernetes.io/projected/4feb0fd5-d952-4323-821e-187c23e16463-kube-api-access-bnj9z\") pod \"dnsmasq-dns-57d769cc4f-jg97k\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") " pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.291679 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jg97k\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") " pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.291758 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-config\") pod \"dnsmasq-dns-57d769cc4f-jg97k\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") " pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.292686 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-config\") pod \"dnsmasq-dns-57d769cc4f-jg97k\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") " pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.293586 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jg97k\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") " pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.317606 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnj9z\" (UniqueName: \"kubernetes.io/projected/4feb0fd5-d952-4323-821e-187c23e16463-kube-api-access-bnj9z\") pod \"dnsmasq-dns-57d769cc4f-jg97k\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") " pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.423771 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.686888 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-bq2gv"]
Mar 14 05:50:14 crc kubenswrapper[4713]: W0314 05:50:14.723557 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58e4241d_f7c9_4c22_9f7a_9391a358eda4.slice/crio-f003af921eb31ffc2d3a291a166b2edfa02caddc45c55b3c6a350794a67626b8 WatchSource:0}: Error finding container f003af921eb31ffc2d3a291a166b2edfa02caddc45c55b3c6a350794a67626b8: Status 404 returned error can't find the container with id f003af921eb31ffc2d3a291a166b2edfa02caddc45c55b3c6a350794a67626b8
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.940300 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.947546 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.952803 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.952901 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.953082 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.953177 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.953185 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.953280 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6t2qm"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.953353 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 14 05:50:14 crc kubenswrapper[4713]: I0314 05:50:14.977562 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 14 05:50:15 crc kubenswrapper[4713]: W0314 05:50:15.023441 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4feb0fd5_d952_4323_821e_187c23e16463.slice/crio-352262f82231a0a19c0ac6d9c1659d4556edf977d3a059b82bbb7c03d25b83d8 WatchSource:0}: Error finding container 352262f82231a0a19c0ac6d9c1659d4556edf977d3a059b82bbb7c03d25b83d8: Status 404 returned error can't find the container with id 352262f82231a0a19c0ac6d9c1659d4556edf977d3a059b82bbb7c03d25b83d8
Mar 14 05:50:15
crc kubenswrapper[4713]: I0314 05:50:15.037944 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jg97k"] Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107269 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107423 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca9b4055-d903-491e-bbf8-4777d51a1af8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107473 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107501 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107586 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107622 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107769 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107854 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d7rh\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-kube-api-access-5d7rh\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107877 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.107970 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.108082 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca9b4055-d903-491e-bbf8-4777d51a1af8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209414 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209486 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca9b4055-d903-491e-bbf8-4777d51a1af8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209525 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209575 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca9b4055-d903-491e-bbf8-4777d51a1af8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209595 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209609 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209642 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209661 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209694 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209710 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d7rh\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-kube-api-access-5d7rh\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.209734 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.210670 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.211141 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.211470 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-plugins\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.211483 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.211561 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.217121 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca9b4055-d903-491e-bbf8-4777d51a1af8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.217515 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca9b4055-d903-491e-bbf8-4777d51a1af8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.222147 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc 
kubenswrapper[4713]: I0314 05:50:15.224559 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.225102 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k" event={"ID":"4feb0fd5-d952-4323-821e-187c23e16463","Type":"ContainerStarted","Data":"352262f82231a0a19c0ac6d9c1659d4556edf977d3a059b82bbb7c03d25b83d8"} Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.225942 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.225967 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d9e5f92b3791c958b2470b172a1fa1218f1b4f5595f643eb3c99247f0499ec82/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.226816 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv" event={"ID":"58e4241d-f7c9-4c22-9f7a-9391a358eda4","Type":"ContainerStarted","Data":"f003af921eb31ffc2d3a291a166b2edfa02caddc45c55b3c6a350794a67626b8"} Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.231755 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d7rh\" (UniqueName: 
\"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-kube-api-access-5d7rh\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.265364 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.282268 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.285935 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.286437 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.286587 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.290793 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.291372 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8ztqd" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.291563 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.296915 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.311765 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.352400 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.364351 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.366430 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.381456 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.384029 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.402074 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.417860 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcxv8\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-kube-api-access-fcxv8\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.417967 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.418021 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-config-data\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.418070 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35c58160-d324-41f9-8c2d-410ba3fb1bb5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.418114 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.418136 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.418258 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35c58160-d324-41f9-8c2d-410ba3fb1bb5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.418278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.418322 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.418352 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.418416 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.449219 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.519740 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcxv8\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-kube-api-access-fcxv8\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 
05:50:15.519846 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.519887 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-config-data\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.519928 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-config-data\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.519953 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.519970 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8fef26f-0e1b-4e81-8969-a4b972708cb3-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.519987 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.520009 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-server-conf\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.520028 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-config-data\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.520055 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35c58160-d324-41f9-8c2d-410ba3fb1bb5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.520071 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/145c4018-82f1-49b5-9d3b-15c97c299a4a-pod-info\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.520095 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.520127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.520166 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.520253 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521167 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521261 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521338 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521369 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521403 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-server-conf\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521449 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35c58160-d324-41f9-8c2d-410ba3fb1bb5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521451 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521473 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521542 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521605 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521627 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521695 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " 
pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521721 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.521733 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.522563 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/145c4018-82f1-49b5-9d3b-15c97c299a4a-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.522706 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.522884 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc 
kubenswrapper[4713]: I0314 05:50:15.524177 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w52g\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-kube-api-access-7w52g\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.524250 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8fef26f-0e1b-4e81-8969-a4b972708cb3-pod-info\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.524294 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvxjq\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-kube-api-access-jvxjq\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.539409 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.542415 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.542484 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b6693a899cd4d176d3943cdc406828cdb93f1723f69ef2bee24905b0841bbd1d/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.542553 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35c58160-d324-41f9-8c2d-410ba3fb1bb5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.542574 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.542746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35c58160-d324-41f9-8c2d-410ba3fb1bb5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.543610 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-config-data\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " 
pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.554314 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcxv8\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-kube-api-access-fcxv8\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.562751 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.564428 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.585279 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.589043 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") pod \"rabbitmq-server-0\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626172 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-config-data\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626275 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8fef26f-0e1b-4e81-8969-a4b972708cb3-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626328 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626361 
4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-server-conf\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626384 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-config-data\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626417 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/145c4018-82f1-49b5-9d3b-15c97c299a4a-pod-info\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626448 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626510 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626551 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626594 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626648 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626684 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626709 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-server-conf\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626743 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " 
pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626770 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626810 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626837 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/145c4018-82f1-49b5-9d3b-15c97c299a4a-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626915 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626943 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w52g\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-kube-api-access-7w52g\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 
05:50:15.626970 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8fef26f-0e1b-4e81-8969-a4b972708cb3-pod-info\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.626994 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvxjq\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-kube-api-access-jvxjq\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.627695 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.628746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-config-data\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.633317 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.634544 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.634708 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.636493 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8fef26f-0e1b-4e81-8969-a4b972708cb3-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.637076 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-server-conf\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.637156 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.637417 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 
05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.637529 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.638435 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.639581 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-server-conf\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.639735 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-config-data\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.646273 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.646741 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.651849 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.651881 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81180cff6c7d54037a275b5eec5606984e9d96ef07a76c39a39ebf70b61a2c81/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.652029 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.652065 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a35ef7d69b40ea4acd6c23ae4e052713a9902ba943c7b732213f20ace37b1946/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.654627 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8fef26f-0e1b-4e81-8969-a4b972708cb3-pod-info\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.654640 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/145c4018-82f1-49b5-9d3b-15c97c299a4a-pod-info\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.656336 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvxjq\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-kube-api-access-jvxjq\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.657585 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/145c4018-82f1-49b5-9d3b-15c97c299a4a-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " 
pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.658161 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w52g\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-kube-api-access-7w52g\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.658496 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.698089 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") pod \"rabbitmq-server-2\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.708419 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") pod \"rabbitmq-server-1\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " pod="openstack/rabbitmq-server-1" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.724517 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 14 05:50:15 crc kubenswrapper[4713]: I0314 05:50:15.996327 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.187712 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.196642 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.205036 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.205526 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.205651 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zx8v7" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.205707 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.224519 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.225994 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.229626 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.262290 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/051e4d6d-86dc-479f-a659-6f95b7baa817-config-data-generated\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.262345 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/051e4d6d-86dc-479f-a659-6f95b7baa817-operator-scripts\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " 
pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.262366 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/051e4d6d-86dc-479f-a659-6f95b7baa817-config-data-default\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.262388 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/051e4d6d-86dc-479f-a659-6f95b7baa817-kolla-config\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.262455 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/051e4d6d-86dc-479f-a659-6f95b7baa817-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.262568 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/051e4d6d-86dc-479f-a659-6f95b7baa817-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.262604 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7zh\" (UniqueName: \"kubernetes.io/projected/051e4d6d-86dc-479f-a659-6f95b7baa817-kube-api-access-vl7zh\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " 
pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.262642 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e819564d-7125-420e-a6a8-df1056c3affd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e819564d-7125-420e-a6a8-df1056c3affd\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.338429 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.357492 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.365950 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/051e4d6d-86dc-479f-a659-6f95b7baa817-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.366030 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7zh\" (UniqueName: \"kubernetes.io/projected/051e4d6d-86dc-479f-a659-6f95b7baa817-kube-api-access-vl7zh\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.366090 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e819564d-7125-420e-a6a8-df1056c3affd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e819564d-7125-420e-a6a8-df1056c3affd\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 
05:50:16.366172 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/051e4d6d-86dc-479f-a659-6f95b7baa817-config-data-generated\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.366194 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/051e4d6d-86dc-479f-a659-6f95b7baa817-operator-scripts\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.366230 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/051e4d6d-86dc-479f-a659-6f95b7baa817-config-data-default\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.366263 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/051e4d6d-86dc-479f-a659-6f95b7baa817-kolla-config\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.366389 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/051e4d6d-86dc-479f-a659-6f95b7baa817-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.367996 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/051e4d6d-86dc-479f-a659-6f95b7baa817-config-data-generated\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.368619 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/051e4d6d-86dc-479f-a659-6f95b7baa817-config-data-default\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.370317 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/051e4d6d-86dc-479f-a659-6f95b7baa817-kolla-config\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.370403 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/051e4d6d-86dc-479f-a659-6f95b7baa817-operator-scripts\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.372170 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.372434 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e819564d-7125-420e-a6a8-df1056c3affd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e819564d-7125-420e-a6a8-df1056c3affd\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fa56db469d7105f322817f7218836d9e238f440aeaea13a1eea896accf4e1a29/globalmount\"" pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.373612 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/051e4d6d-86dc-479f-a659-6f95b7baa817-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.376032 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/051e4d6d-86dc-479f-a659-6f95b7baa817-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.384951 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7zh\" (UniqueName: \"kubernetes.io/projected/051e4d6d-86dc-479f-a659-6f95b7baa817-kube-api-access-vl7zh\") pod \"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.421451 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e819564d-7125-420e-a6a8-df1056c3affd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e819564d-7125-420e-a6a8-df1056c3affd\") pod 
\"openstack-galera-0\" (UID: \"051e4d6d-86dc-479f-a659-6f95b7baa817\") " pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.538511 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 05:50:16 crc kubenswrapper[4713]: I0314 05:50:16.651521 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 14 05:50:16 crc kubenswrapper[4713]: W0314 05:50:16.666565 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8fef26f_0e1b_4e81_8969_a4b972708cb3.slice/crio-1dca3daa524affe675e6a88e697e16dfbd60477868421a0126c4ac08a1f13492 WatchSource:0}: Error finding container 1dca3daa524affe675e6a88e697e16dfbd60477868421a0126c4ac08a1f13492: Status 404 returned error can't find the container with id 1dca3daa524affe675e6a88e697e16dfbd60477868421a0126c4ac08a1f13492 Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.134141 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 05:50:17 crc kubenswrapper[4713]: W0314 05:50:17.142866 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod051e4d6d_86dc_479f_a659_6f95b7baa817.slice/crio-4f063f44945e7d4ffdc911b17be36182919bb132cfb91010b00d392a67ba6a86 WatchSource:0}: Error finding container 4f063f44945e7d4ffdc911b17be36182919bb132cfb91010b00d392a67ba6a86: Status 404 returned error can't find the container with id 4f063f44945e7d4ffdc911b17be36182919bb132cfb91010b00d392a67ba6a86 Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.296026 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b8fef26f-0e1b-4e81-8969-a4b972708cb3","Type":"ContainerStarted","Data":"1dca3daa524affe675e6a88e697e16dfbd60477868421a0126c4ac08a1f13492"} Mar 14 05:50:17 crc 
kubenswrapper[4713]: I0314 05:50:17.298786 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca9b4055-d903-491e-bbf8-4777d51a1af8","Type":"ContainerStarted","Data":"b4f34d1e9cab71f50b8ab398225afc470fea2842bba9479d691c4400fa71f068"} Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.302361 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"051e4d6d-86dc-479f-a659-6f95b7baa817","Type":"ContainerStarted","Data":"4f063f44945e7d4ffdc911b17be36182919bb132cfb91010b00d392a67ba6a86"} Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.304012 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35c58160-d324-41f9-8c2d-410ba3fb1bb5","Type":"ContainerStarted","Data":"f82af143d02fd270cb62c42baceaebe3e8073dee67157a598f5ab0c4c9ac9b18"} Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.305947 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"145c4018-82f1-49b5-9d3b-15c97c299a4a","Type":"ContainerStarted","Data":"79d4b290a60238e60d2f1acec0af0949e417f1dfe7b520d5f2570d13f9b0e7fd"} Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.524857 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.527324 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.530932 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-x9dp5" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.531124 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.531269 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.533129 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.558855 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.704389 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.704490 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.704530 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.704621 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.704818 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6g5z\" (UniqueName: \"kubernetes.io/projected/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-kube-api-access-h6g5z\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.704843 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.704914 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c84fb4d6-a846-4ac3-bb03-a8d54af5b628\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84fb4d6-a846-4ac3-bb03-a8d54af5b628\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.704947 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.807829 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6g5z\" (UniqueName: \"kubernetes.io/projected/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-kube-api-access-h6g5z\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.807899 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.807978 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c84fb4d6-a846-4ac3-bb03-a8d54af5b628\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84fb4d6-a846-4ac3-bb03-a8d54af5b628\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.808015 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.808074 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.808111 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.808154 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.808244 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.808850 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.809983 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.811184 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.817154 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.817346 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.818047 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.819398 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.819469 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c84fb4d6-a846-4ac3-bb03-a8d54af5b628\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84fb4d6-a846-4ac3-bb03-a8d54af5b628\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f0951f5338efcf59d4749848457a9d4de6b25e18ac1a6ff256d1804b4ff65c3f/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.827744 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.833095 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.839159 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hcj6r" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.839724 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.839764 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.845138 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6g5z\" (UniqueName: \"kubernetes.io/projected/02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b-kube-api-access-h6g5z\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 05:50:17.902455 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 05:50:17 crc kubenswrapper[4713]: I0314 
05:50:17.908632 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c84fb4d6-a846-4ac3-bb03-a8d54af5b628\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84fb4d6-a846-4ac3-bb03-a8d54af5b628\") pod \"openstack-cell1-galera-0\" (UID: \"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b\") " pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.020821 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtqp9\" (UniqueName: \"kubernetes.io/projected/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-kube-api-access-vtqp9\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.020901 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.020937 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-kolla-config\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.021051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.021117 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-config-data\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.124183 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-config-data\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.125049 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtqp9\" (UniqueName: \"kubernetes.io/projected/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-kube-api-access-vtqp9\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.125127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.125166 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-kolla-config\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.125343 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.128521 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-kolla-config\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.128576 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-config-data\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.139701 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.149458 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtqp9\" (UniqueName: \"kubernetes.io/projected/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-kube-api-access-vtqp9\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.150280 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c32295a7-c2a9-461b-a4eb-9d9eeb2fc645-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645\") " pod="openstack/memcached-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.160995 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 05:50:18 crc kubenswrapper[4713]: I0314 05:50:18.231185 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 14 05:50:20 crc kubenswrapper[4713]: I0314 05:50:20.222759 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 05:50:20 crc kubenswrapper[4713]: I0314 05:50:20.224131 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 05:50:20 crc kubenswrapper[4713]: I0314 05:50:20.239706 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-pp5gr" Mar 14 05:50:20 crc kubenswrapper[4713]: I0314 05:50:20.243840 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 05:50:20 crc kubenswrapper[4713]: I0314 05:50:20.373522 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwgkg\" (UniqueName: \"kubernetes.io/projected/f04b986f-f5da-4458-93bc-c093c0f8a24b-kube-api-access-xwgkg\") pod \"kube-state-metrics-0\" (UID: \"f04b986f-f5da-4458-93bc-c093c0f8a24b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:50:20 crc kubenswrapper[4713]: I0314 05:50:20.475813 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwgkg\" (UniqueName: \"kubernetes.io/projected/f04b986f-f5da-4458-93bc-c093c0f8a24b-kube-api-access-xwgkg\") pod \"kube-state-metrics-0\" (UID: \"f04b986f-f5da-4458-93bc-c093c0f8a24b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:50:20 crc kubenswrapper[4713]: I0314 05:50:20.536045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwgkg\" (UniqueName: \"kubernetes.io/projected/f04b986f-f5da-4458-93bc-c093c0f8a24b-kube-api-access-xwgkg\") pod \"kube-state-metrics-0\" (UID: 
\"f04b986f-f5da-4458-93bc-c093c0f8a24b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:50:20 crc kubenswrapper[4713]: I0314 05:50:20.547139 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.144407 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d"] Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.148479 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.152695 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-fs9vg" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.152937 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.155519 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d"] Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.295451 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9p9d\" (UniqueName: \"kubernetes.io/projected/265f475a-c481-4f65-a176-b3d4c55c691d-kube-api-access-b9p9d\") pod \"observability-ui-dashboards-66cbf594b5-hj46d\" (UID: \"265f475a-c481-4f65-a176-b3d4c55c691d\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.295495 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/265f475a-c481-4f65-a176-b3d4c55c691d-serving-cert\") pod 
\"observability-ui-dashboards-66cbf594b5-hj46d\" (UID: \"265f475a-c481-4f65-a176-b3d4c55c691d\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.397016 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9p9d\" (UniqueName: \"kubernetes.io/projected/265f475a-c481-4f65-a176-b3d4c55c691d-kube-api-access-b9p9d\") pod \"observability-ui-dashboards-66cbf594b5-hj46d\" (UID: \"265f475a-c481-4f65-a176-b3d4c55c691d\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.397075 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/265f475a-c481-4f65-a176-b3d4c55c691d-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-hj46d\" (UID: \"265f475a-c481-4f65-a176-b3d4c55c691d\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" Mar 14 05:50:21 crc kubenswrapper[4713]: E0314 05:50:21.397329 4713 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Mar 14 05:50:21 crc kubenswrapper[4713]: E0314 05:50:21.397417 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/265f475a-c481-4f65-a176-b3d4c55c691d-serving-cert podName:265f475a-c481-4f65-a176-b3d4c55c691d nodeName:}" failed. No retries permitted until 2026-03-14 05:50:21.897364258 +0000 UTC m=+1404.985273548 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/265f475a-c481-4f65-a176-b3d4c55c691d-serving-cert") pod "observability-ui-dashboards-66cbf594b5-hj46d" (UID: "265f475a-c481-4f65-a176-b3d4c55c691d") : secret "observability-ui-dashboards" not found Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.431558 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9p9d\" (UniqueName: \"kubernetes.io/projected/265f475a-c481-4f65-a176-b3d4c55c691d-kube-api-access-b9p9d\") pod \"observability-ui-dashboards-66cbf594b5-hj46d\" (UID: \"265f475a-c481-4f65-a176-b3d4c55c691d\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.516479 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-79df5895fd-4nxm5"] Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.534143 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.534784 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.552012 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.562896 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.564899 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.565854 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.566105 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.566407 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-pfd5j" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.566558 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.566696 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.568080 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.604279 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79df5895fd-4nxm5"] Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.604314 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712420 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-trusted-ca-bundle\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712723 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712763 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712795 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-config\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712814 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 
05:50:21.712851 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-service-ca\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712872 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712894 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-console-config\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712910 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkp8w\" (UniqueName: \"kubernetes.io/projected/635fd3d7-4984-4dae-9416-068a4d020d75-kube-api-access-wkp8w\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-oauth-serving-cert\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " 
pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.712987 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.713012 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn6p7\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-kube-api-access-kn6p7\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.713031 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.713048 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/635fd3d7-4984-4dae-9416-068a4d020d75-console-oauth-config\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.713073 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/635fd3d7-4984-4dae-9416-068a4d020d75-console-serving-cert\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.713108 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8eea30f2-9f63-4f93-a711-48fee1d631c2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.713150 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815087 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815164 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-trusted-ca-bundle\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815192 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815226 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815256 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-config\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815275 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815311 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-service-ca\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815331 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815355 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-console-config\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815373 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkp8w\" (UniqueName: \"kubernetes.io/projected/635fd3d7-4984-4dae-9416-068a4d020d75-kube-api-access-wkp8w\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-oauth-serving-cert\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815437 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815461 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kn6p7\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-kube-api-access-kn6p7\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815497 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/635fd3d7-4984-4dae-9416-068a4d020d75-console-oauth-config\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815518 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/635fd3d7-4984-4dae-9416-068a4d020d75-console-serving-cert\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.815553 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8eea30f2-9f63-4f93-a711-48fee1d631c2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.816035 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.816479 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-trusted-ca-bundle\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.816854 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-console-config\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.816911 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.817114 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc 
kubenswrapper[4713]: I0314 05:50:21.817592 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-service-ca\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.820961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-config\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.822163 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/635fd3d7-4984-4dae-9416-068a4d020d75-oauth-serving-cert\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.827515 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/635fd3d7-4984-4dae-9416-068a4d020d75-console-serving-cert\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.827621 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.827782 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.828377 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.828400 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3ca1519da4cb0118aff35a94efbea77d8a6bbedb0ea01472a96864ab8cceb7b7/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.831767 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.832827 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/635fd3d7-4984-4dae-9416-068a4d020d75-console-oauth-config\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.833841 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/8eea30f2-9f63-4f93-a711-48fee1d631c2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.840187 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn6p7\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-kube-api-access-kn6p7\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.840537 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkp8w\" (UniqueName: \"kubernetes.io/projected/635fd3d7-4984-4dae-9416-068a4d020d75-kube-api-access-wkp8w\") pod \"console-79df5895fd-4nxm5\" (UID: \"635fd3d7-4984-4dae-9416-068a4d020d75\") " pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.864107 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") pod \"prometheus-metric-storage-0\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.916881 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/265f475a-c481-4f65-a176-b3d4c55c691d-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-hj46d\" (UID: \"265f475a-c481-4f65-a176-b3d4c55c691d\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.920246 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/265f475a-c481-4f65-a176-b3d4c55c691d-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-hj46d\" (UID: \"265f475a-c481-4f65-a176-b3d4c55c691d\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.923017 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 05:50:21 crc kubenswrapper[4713]: I0314 05:50:21.939736 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 05:50:22 crc kubenswrapper[4713]: I0314 05:50:22.088019 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.661634 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lk79w"] Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.663448 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lk79w" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.665648 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.666567 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.669866 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lk79w"] Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.671339 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-clhqb" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.733225 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-6pzwm"] Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.738011 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-6pzwm" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.746658 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6pzwm"] Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.760337 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-ovn-controller-tls-certs\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.760418 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-var-run-ovn\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.760453 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-combined-ca-bundle\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.760478 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-var-log-ovn\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w" Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.760501 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-scripts\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.760515 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bjqv\" (UniqueName: \"kubernetes.io/projected/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-kube-api-access-9bjqv\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.760538 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-var-run\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862214 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1475dc78-ec5d-45b6-a21d-0e6e6320a012-scripts\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862271 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-ovn-controller-tls-certs\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862331 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-var-lib\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862371 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-var-run-ovn\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862401 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-combined-ca-bundle\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862422 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-var-run\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862439 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-var-log-ovn\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862461 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-scripts\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862475 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bjqv\" (UniqueName: \"kubernetes.io/projected/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-kube-api-access-9bjqv\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862492 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxrd\" (UniqueName: \"kubernetes.io/projected/1475dc78-ec5d-45b6-a21d-0e6e6320a012-kube-api-access-nbxrd\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862507 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-etc-ovs\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862531 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-var-run\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.862587 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-var-log\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.863087 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-var-run-ovn\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.865196 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-var-log-ovn\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.865461 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-var-run\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.867439 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-scripts\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.877800 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-combined-ca-bundle\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.879306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-ovn-controller-tls-certs\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.885865 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bjqv\" (UniqueName: \"kubernetes.io/projected/2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca-kube-api-access-9bjqv\") pod \"ovn-controller-lk79w\" (UID: \"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca\") " pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.969771 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-var-lib\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.970002 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-var-run\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.970071 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxrd\" (UniqueName: \"kubernetes.io/projected/1475dc78-ec5d-45b6-a21d-0e6e6320a012-kube-api-access-nbxrd\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.970116 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-etc-ovs\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.970297 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-var-log\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.970412 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1475dc78-ec5d-45b6-a21d-0e6e6320a012-scripts\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.973025 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1475dc78-ec5d-45b6-a21d-0e6e6320a012-scripts\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.973226 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-var-lib\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.973294 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-var-run\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.973849 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-etc-ovs\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.973961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1475dc78-ec5d-45b6-a21d-0e6e6320a012-var-log\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:23 crc kubenswrapper[4713]: I0314 05:50:23.999828 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.010497 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxrd\" (UniqueName: \"kubernetes.io/projected/1475dc78-ec5d-45b6-a21d-0e6e6320a012-kube-api-access-nbxrd\") pod \"ovn-controller-ovs-6pzwm\" (UID: \"1475dc78-ec5d-45b6-a21d-0e6e6320a012\") " pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.092007 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.502676 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.504668 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.511593 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2zlrn"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.511663 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.511765 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.511820 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.511928 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.543288 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.586679 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.586966 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.587151 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.587332 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-31234f2e-eef6-461d-83ff-4367a0ed5dfc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31234f2e-eef6-461d-83ff-4367a0ed5dfc\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.587563 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-config\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.587689 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.587786 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.587923 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmbkr\" (UniqueName: \"kubernetes.io/projected/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-kube-api-access-mmbkr\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.690154 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmbkr\" (UniqueName: \"kubernetes.io/projected/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-kube-api-access-mmbkr\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.690320 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.690366 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.690500 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.690687 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-31234f2e-eef6-461d-83ff-4367a0ed5dfc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31234f2e-eef6-461d-83ff-4367a0ed5dfc\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.690826 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-config\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.690899 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.690930 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.694440 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.694759 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.695142 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-config\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.696225 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.697188 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.697270 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-31234f2e-eef6-461d-83ff-4367a0ed5dfc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31234f2e-eef6-461d-83ff-4367a0ed5dfc\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/af88b169a1ddf5ca4664c2b4fce31124358a5ce1e51c8d66505318c34172cdc5/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.698084 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.699438 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.719913 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmbkr\" (UniqueName: \"kubernetes.io/projected/d30edab6-1aa0-47d8-a20d-d2d2d0d6185d-kube-api-access-mmbkr\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.739722 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-31234f2e-eef6-461d-83ff-4367a0ed5dfc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31234f2e-eef6-461d-83ff-4367a0ed5dfc\") pod \"ovsdbserver-nb-0\" (UID: \"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d\") " pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:24 crc kubenswrapper[4713]: I0314 05:50:24.847118 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.027274 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.029902 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.035191 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jtxbd"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.035339 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.037822 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.038268 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.043806 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.156486 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6c0223c4-c99e-4cf6-95dc-4ef6137ca0a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6c0223c4-c99e-4cf6-95dc-4ef6137ca0a6\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.156638 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvkfn\" (UniqueName: \"kubernetes.io/projected/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-kube-api-access-mvkfn\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.156683 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.156771 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.156799 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.156948 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.157000 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-config\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.157023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.259490 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvkfn\" (UniqueName: \"kubernetes.io/projected/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-kube-api-access-mvkfn\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.259556 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.259651 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.259681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.259731 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.259776 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-config\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.259801 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.259859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6c0223c4-c99e-4cf6-95dc-4ef6137ca0a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6c0223c4-c99e-4cf6-95dc-4ef6137ca0a6\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.260494 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.260787 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-config\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.261113 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.267582 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.267730 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.267800 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6c0223c4-c99e-4cf6-95dc-4ef6137ca0a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6c0223c4-c99e-4cf6-95dc-4ef6137ca0a6\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f2580ddf911a2d13ad12ceef034afa6c6fabc24b614d9225fa9adc19b35ad1c1/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.271237 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.277567 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.283484 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvkfn\" (UniqueName: \"kubernetes.io/projected/92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7-kube-api-access-mvkfn\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.302939 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6c0223c4-c99e-4cf6-95dc-4ef6137ca0a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6c0223c4-c99e-4cf6-95dc-4ef6137ca0a6\") pod \"ovsdbserver-sb-0\" (UID: \"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7\") " pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:27 crc kubenswrapper[4713]: I0314 05:50:27.361568 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:30 crc kubenswrapper[4713]: E0314 05:50:30.993015 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 14 05:50:30 crc kubenswrapper[4713]: E0314 05:50:30.994065 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4gj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-mr52c_openstack(19474f89-6d04-494d-93d5-235f873cfcdb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 14 05:50:30 crc kubenswrapper[4713]: E0314 05:50:30.995367 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c" podUID="19474f89-6d04-494d-93d5-235f873cfcdb"
Mar 14 05:50:32 crc kubenswrapper[4713]: E0314 05:50:32.227818 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 14 05:50:32 crc kubenswrapper[4713]: E0314 05:50:32.228264 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j2wb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-q8z8q_openstack(f9fb1255-a854-45aa-8f7b-1cac97290ec4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:50:32 crc kubenswrapper[4713]: E0314 05:50:32.229351 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q" podUID="f9fb1255-a854-45aa-8f7b-1cac97290ec4" Mar 14 05:50:32 crc kubenswrapper[4713]: E0314 05:50:32.233578 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 14 05:50:32 crc kubenswrapper[4713]: E0314 05:50:32.233893 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-724zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPol
icy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-bq2gv_openstack(58e4241d-f7c9-4c22-9f7a-9391a358eda4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:50:32 crc kubenswrapper[4713]: E0314 05:50:32.235069 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv" podUID="58e4241d-f7c9-4c22-9f7a-9391a358eda4" Mar 14 05:50:32 crc kubenswrapper[4713]: I0314 05:50:32.482178 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 05:50:32 crc kubenswrapper[4713]: E0314 05:50:32.508154 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv" podUID="58e4241d-f7c9-4c22-9f7a-9391a358eda4" Mar 14 05:50:34 crc kubenswrapper[4713]: W0314 05:50:34.254675 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eea30f2_9f63_4f93_a711_48fee1d631c2.slice/crio-0fbe9df95816c6cff2bb12345300d473a4691a08c6e77ebeb415471f3c6a2955 
WatchSource:0}: Error finding container 0fbe9df95816c6cff2bb12345300d473a4691a08c6e77ebeb415471f3c6a2955: Status 404 returned error can't find the container with id 0fbe9df95816c6cff2bb12345300d473a4691a08c6e77ebeb415471f3c6a2955 Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.540415 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.543804 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.575716 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q" event={"ID":"f9fb1255-a854-45aa-8f7b-1cac97290ec4","Type":"ContainerDied","Data":"e39f7cadad07fe5e8e94994912a2f6d27f4e218d56b214e37171bf54c3d600c5"} Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.576006 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q8z8q" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.584247 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerStarted","Data":"0fbe9df95816c6cff2bb12345300d473a4691a08c6e77ebeb415471f3c6a2955"} Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.589823 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c" event={"ID":"19474f89-6d04-494d-93d5-235f873cfcdb","Type":"ContainerDied","Data":"22f662f2e4262456a15c9487fda76d219bf80076a89f5112e04e6eaad0cdf077"} Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.589930 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mr52c" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.633969 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9fb1255-a854-45aa-8f7b-1cac97290ec4-config\") pod \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\" (UID: \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\") " Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.634102 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-dns-svc\") pod \"19474f89-6d04-494d-93d5-235f873cfcdb\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.634168 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2wb5\" (UniqueName: \"kubernetes.io/projected/f9fb1255-a854-45aa-8f7b-1cac97290ec4-kube-api-access-j2wb5\") pod \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\" (UID: \"f9fb1255-a854-45aa-8f7b-1cac97290ec4\") " Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.634369 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-config\") pod \"19474f89-6d04-494d-93d5-235f873cfcdb\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.634473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4gj5\" (UniqueName: \"kubernetes.io/projected/19474f89-6d04-494d-93d5-235f873cfcdb-kube-api-access-q4gj5\") pod \"19474f89-6d04-494d-93d5-235f873cfcdb\" (UID: \"19474f89-6d04-494d-93d5-235f873cfcdb\") " Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.634471 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f9fb1255-a854-45aa-8f7b-1cac97290ec4-config" (OuterVolumeSpecName: "config") pod "f9fb1255-a854-45aa-8f7b-1cac97290ec4" (UID: "f9fb1255-a854-45aa-8f7b-1cac97290ec4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.634847 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19474f89-6d04-494d-93d5-235f873cfcdb" (UID: "19474f89-6d04-494d-93d5-235f873cfcdb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.635264 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-config" (OuterVolumeSpecName: "config") pod "19474f89-6d04-494d-93d5-235f873cfcdb" (UID: "19474f89-6d04-494d-93d5-235f873cfcdb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.636011 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.636034 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19474f89-6d04-494d-93d5-235f873cfcdb-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.636062 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9fb1255-a854-45aa-8f7b-1cac97290ec4-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.706391 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9fb1255-a854-45aa-8f7b-1cac97290ec4-kube-api-access-j2wb5" (OuterVolumeSpecName: "kube-api-access-j2wb5") pod "f9fb1255-a854-45aa-8f7b-1cac97290ec4" (UID: "f9fb1255-a854-45aa-8f7b-1cac97290ec4"). InnerVolumeSpecName "kube-api-access-j2wb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.719219 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19474f89-6d04-494d-93d5-235f873cfcdb-kube-api-access-q4gj5" (OuterVolumeSpecName: "kube-api-access-q4gj5") pod "19474f89-6d04-494d-93d5-235f873cfcdb" (UID: "19474f89-6d04-494d-93d5-235f873cfcdb"). InnerVolumeSpecName "kube-api-access-q4gj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.728786 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79df5895fd-4nxm5"] Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.751044 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4gj5\" (UniqueName: \"kubernetes.io/projected/19474f89-6d04-494d-93d5-235f873cfcdb-kube-api-access-q4gj5\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.751069 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2wb5\" (UniqueName: \"kubernetes.io/projected/f9fb1255-a854-45aa-8f7b-1cac97290ec4-kube-api-access-j2wb5\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.937231 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q8z8q"] Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.952367 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q8z8q"] Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.974679 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mr52c"] Mar 14 05:50:34 crc kubenswrapper[4713]: I0314 05:50:34.984215 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mr52c"] Mar 14 05:50:34 crc kubenswrapper[4713]: W0314 05:50:34.993509 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf04b986f_f5da_4458_93bc_c093c0f8a24b.slice/crio-7ce09c6b1b05b653eae10831cf0ce24371bbfd43eb71f7cf7121c37c7aee8490 WatchSource:0}: Error finding container 7ce09c6b1b05b653eae10831cf0ce24371bbfd43eb71f7cf7121c37c7aee8490: Status 404 returned error can't find the container with id 7ce09c6b1b05b653eae10831cf0ce24371bbfd43eb71f7cf7121c37c7aee8490 Mar 14 05:50:34 
crc kubenswrapper[4713]: I0314 05:50:34.995010 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.553749 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d"] Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.581571 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19474f89-6d04-494d-93d5-235f873cfcdb" path="/var/lib/kubelet/pods/19474f89-6d04-494d-93d5-235f873cfcdb/volumes" Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.582466 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9fb1255-a854-45aa-8f7b-1cac97290ec4" path="/var/lib/kubelet/pods/f9fb1255-a854-45aa-8f7b-1cac97290ec4/volumes" Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.582832 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lk79w"] Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.606342 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.610995 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79df5895fd-4nxm5" event={"ID":"635fd3d7-4984-4dae-9416-068a4d020d75","Type":"ContainerStarted","Data":"153cd0c1dc9933e97cae88de960bf4815fac02582d26c23c2910a4dad685d566"} Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.613482 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f04b986f-f5da-4458-93bc-c093c0f8a24b","Type":"ContainerStarted","Data":"7ce09c6b1b05b653eae10831cf0ce24371bbfd43eb71f7cf7121c37c7aee8490"} Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.614828 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 05:50:35 crc kubenswrapper[4713]: W0314 05:50:35.616456 4713 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod265f475a_c481_4f65_a176_b3d4c55c691d.slice/crio-235bccb7c2f08f48e8701f19d169c54b030239799d94eb6dee6ec79d19438ebe WatchSource:0}: Error finding container 235bccb7c2f08f48e8701f19d169c54b030239799d94eb6dee6ec79d19438ebe: Status 404 returned error can't find the container with id 235bccb7c2f08f48e8701f19d169c54b030239799d94eb6dee6ec79d19438ebe Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.619100 4713 generic.go:334] "Generic (PLEG): container finished" podID="4feb0fd5-d952-4323-821e-187c23e16463" containerID="74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879" exitCode=0 Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.619145 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k" event={"ID":"4feb0fd5-d952-4323-821e-187c23e16463","Type":"ContainerDied","Data":"74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879"} Mar 14 05:50:35 crc kubenswrapper[4713]: W0314 05:50:35.821628 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc32295a7_c2a9_461b_a4eb_9d9eeb2fc645.slice/crio-11a813344bfe46ecb41ec3a76b4265d4867059b6152e133983f1bb70e70fb0a9 WatchSource:0}: Error finding container 11a813344bfe46ecb41ec3a76b4265d4867059b6152e133983f1bb70e70fb0a9: Status 404 returned error can't find the container with id 11a813344bfe46ecb41ec3a76b4265d4867059b6152e133983f1bb70e70fb0a9 Mar 14 05:50:35 crc kubenswrapper[4713]: I0314 05:50:35.950505 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 05:50:36 crc kubenswrapper[4713]: W0314 05:50:36.312701 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92b8fdd2_6f8a_46d6_b301_d4e7e5aeb4f7.slice/crio-99aa7dd154660ec9edcc60ff4c14e6ace9997aebf9d4c1baf825bb313375720f WatchSource:0}: Error finding container 99aa7dd154660ec9edcc60ff4c14e6ace9997aebf9d4c1baf825bb313375720f: Status 404 returned error can't find the container with id 99aa7dd154660ec9edcc60ff4c14e6ace9997aebf9d4c1baf825bb313375720f Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.632019 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"145c4018-82f1-49b5-9d3b-15c97c299a4a","Type":"ContainerStarted","Data":"aa4a5ceb1b002ee0c5369d9f8a6166c1e3452680788305e7434abe62eb526493"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.633400 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7","Type":"ContainerStarted","Data":"99aa7dd154660ec9edcc60ff4c14e6ace9997aebf9d4c1baf825bb313375720f"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.637164 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" event={"ID":"265f475a-c481-4f65-a176-b3d4c55c691d","Type":"ContainerStarted","Data":"235bccb7c2f08f48e8701f19d169c54b030239799d94eb6dee6ec79d19438ebe"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.638809 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79df5895fd-4nxm5" event={"ID":"635fd3d7-4984-4dae-9416-068a4d020d75","Type":"ContainerStarted","Data":"eeb3e03fa7bd426ee9d2bef8c7e4bccffda4caa2acf50867938f1ca4891ed373"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.641904 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lk79w" event={"ID":"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca","Type":"ContainerStarted","Data":"c7b8573e6f58a6ca2607455275a6904a1c42117a6da7c4eefd1f3b0935b3cdc8"} Mar 14 
05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.643663 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b","Type":"ContainerStarted","Data":"de4ea962ee8951cc4b778cdbaff4a65e6f9b81ed7851c34ab205db6f126b3b9e"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.645242 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca9b4055-d903-491e-bbf8-4777d51a1af8","Type":"ContainerStarted","Data":"757454825f7b8736377c878e350329ecf32a743829e7db7e77afe631eff684e5"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.650982 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35c58160-d324-41f9-8c2d-410ba3fb1bb5","Type":"ContainerStarted","Data":"6d464f5b92a469a58eed5cef7c3b96f48a6b9cba513745ea0786417981a94f06"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.657868 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b8fef26f-0e1b-4e81-8969-a4b972708cb3","Type":"ContainerStarted","Data":"d4c0f09392624ed19b999fbb5f7689557216f97a3ec3122b4ec0efdb2a5ceb6e"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.661363 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645","Type":"ContainerStarted","Data":"11a813344bfe46ecb41ec3a76b4265d4867059b6152e133983f1bb70e70fb0a9"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.666860 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"051e4d6d-86dc-479f-a659-6f95b7baa817","Type":"ContainerStarted","Data":"b7633291ade9acbe506776637d909a242ceebe0598ca068ba05eaa459df7fa4b"} Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.690058 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-79df5895fd-4nxm5" podStartSLOduration=15.690038611 podStartE2EDuration="15.690038611s" podCreationTimestamp="2026-03-14 05:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:50:36.685135874 +0000 UTC m=+1419.773045184" watchObservedRunningTime="2026-03-14 05:50:36.690038611 +0000 UTC m=+1419.777947911" Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.814038 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6pzwm"] Mar 14 05:50:36 crc kubenswrapper[4713]: W0314 05:50:36.844560 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1475dc78_ec5d_45b6_a21d_0e6e6320a012.slice/crio-2213124fba0df503e4ca4c7c2d8e378243fea4e7bb8a2f2d67d84b3553816717 WatchSource:0}: Error finding container 2213124fba0df503e4ca4c7c2d8e378243fea4e7bb8a2f2d67d84b3553816717: Status 404 returned error can't find the container with id 2213124fba0df503e4ca4c7c2d8e378243fea4e7bb8a2f2d67d84b3553816717 Mar 14 05:50:36 crc kubenswrapper[4713]: I0314 05:50:36.869481 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 05:50:36 crc kubenswrapper[4713]: W0314 05:50:36.922880 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd30edab6_1aa0_47d8_a20d_d2d2d0d6185d.slice/crio-967a7c1d521150117a0adf0d6595f171a4edade6618efb6fbf6c06c34887c3b8 WatchSource:0}: Error finding container 967a7c1d521150117a0adf0d6595f171a4edade6618efb6fbf6c06c34887c3b8: Status 404 returned error can't find the container with id 967a7c1d521150117a0adf0d6595f171a4edade6618efb6fbf6c06c34887c3b8 Mar 14 05:50:37 crc kubenswrapper[4713]: I0314 05:50:37.686455 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k" 
event={"ID":"4feb0fd5-d952-4323-821e-187c23e16463","Type":"ContainerStarted","Data":"07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616"} Mar 14 05:50:37 crc kubenswrapper[4713]: I0314 05:50:37.686603 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k" Mar 14 05:50:37 crc kubenswrapper[4713]: I0314 05:50:37.687884 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6pzwm" event={"ID":"1475dc78-ec5d-45b6-a21d-0e6e6320a012","Type":"ContainerStarted","Data":"2213124fba0df503e4ca4c7c2d8e378243fea4e7bb8a2f2d67d84b3553816717"} Mar 14 05:50:37 crc kubenswrapper[4713]: I0314 05:50:37.693807 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d","Type":"ContainerStarted","Data":"967a7c1d521150117a0adf0d6595f171a4edade6618efb6fbf6c06c34887c3b8"} Mar 14 05:50:37 crc kubenswrapper[4713]: I0314 05:50:37.811551 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k" podStartSLOduration=4.553397012 podStartE2EDuration="23.811527496s" podCreationTimestamp="2026-03-14 05:50:14 +0000 UTC" firstStartedPulling="2026-03-14 05:50:15.043837162 +0000 UTC m=+1398.131746462" lastFinishedPulling="2026-03-14 05:50:34.301967646 +0000 UTC m=+1417.389876946" observedRunningTime="2026-03-14 05:50:37.799966678 +0000 UTC m=+1420.887875988" watchObservedRunningTime="2026-03-14 05:50:37.811527496 +0000 UTC m=+1420.899436796" Mar 14 05:50:39 crc kubenswrapper[4713]: I0314 05:50:39.714181 4713 generic.go:334] "Generic (PLEG): container finished" podID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerID="b7633291ade9acbe506776637d909a242ceebe0598ca068ba05eaa459df7fa4b" exitCode=0 Mar 14 05:50:39 crc kubenswrapper[4713]: I0314 05:50:39.714272 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"051e4d6d-86dc-479f-a659-6f95b7baa817","Type":"ContainerDied","Data":"b7633291ade9acbe506776637d909a242ceebe0598ca068ba05eaa459df7fa4b"}
Mar 14 05:50:41 crc kubenswrapper[4713]: I0314 05:50:41.740423 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b","Type":"ContainerStarted","Data":"3a40b1b5179e261c631568cecc696c7f195703f3d4a5d3db8bb725c738da9be2"}
Mar 14 05:50:41 crc kubenswrapper[4713]: I0314 05:50:41.924096 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-79df5895fd-4nxm5"
Mar 14 05:50:41 crc kubenswrapper[4713]: I0314 05:50:41.924150 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79df5895fd-4nxm5"
Mar 14 05:50:41 crc kubenswrapper[4713]: I0314 05:50:41.931248 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79df5895fd-4nxm5"
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.751819 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"051e4d6d-86dc-479f-a659-6f95b7baa817","Type":"ContainerStarted","Data":"14748d02753ab502f9ae5a5c30c424c7a250193acb463caf9edf740cde85c571"}
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.753635 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7","Type":"ContainerStarted","Data":"eb5f4c976cb61030fbad1b927008052fc044f7a39a0a15bd78369c29e6b90f54"}
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.755657 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" event={"ID":"265f475a-c481-4f65-a176-b3d4c55c691d","Type":"ContainerStarted","Data":"0190ec733ce4a928400be8d2597fb8a6ac3c98f160ef09684d253aec39c30cb6"}
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.757020 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f04b986f-f5da-4458-93bc-c093c0f8a24b","Type":"ContainerStarted","Data":"d72529fc4cdf24bdd4a9ebde5d7efc3d516026012343812cdcb517c23427bba3"}
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.757097 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.758715 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c32295a7-c2a9-461b-a4eb-9d9eeb2fc645","Type":"ContainerStarted","Data":"cfaab3602253232b55b047db2efb2a221d7512dcb486e1f38184a8f53a07b3ea"}
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.758872 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.760614 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lk79w" event={"ID":"2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca","Type":"ContainerStarted","Data":"222da70cca4938a6d2ab5e03111c397bdc09076f4d8866af5b8cb6360f996e54"}
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.761715 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lk79w"
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.763384 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6pzwm" event={"ID":"1475dc78-ec5d-45b6-a21d-0e6e6320a012","Type":"ContainerStarted","Data":"49604bd66aa9cf1fc9531f0561e4f77d187a96374e559a42a8acd42af149c7be"}
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.765865 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d","Type":"ContainerStarted","Data":"6e8408b0d024714120d5c64020de6d38a9e58eea143aaaafa3e41e7b13b02cc0"}
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.769649 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79df5895fd-4nxm5"
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.845674 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=10.592033617 podStartE2EDuration="27.845647691s" podCreationTimestamp="2026-03-14 05:50:15 +0000 UTC" firstStartedPulling="2026-03-14 05:50:17.145669764 +0000 UTC m=+1400.233579064" lastFinishedPulling="2026-03-14 05:50:34.399283838 +0000 UTC m=+1417.487193138" observedRunningTime="2026-03-14 05:50:42.78001939 +0000 UTC m=+1425.867928690" watchObservedRunningTime="2026-03-14 05:50:42.845647691 +0000 UTC m=+1425.933556991"
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.853921 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hj46d" podStartSLOduration=16.559839235 podStartE2EDuration="21.853902874s" podCreationTimestamp="2026-03-14 05:50:21 +0000 UTC" firstStartedPulling="2026-03-14 05:50:35.618642312 +0000 UTC m=+1418.706551612" lastFinishedPulling="2026-03-14 05:50:40.912705951 +0000 UTC m=+1424.000615251" observedRunningTime="2026-03-14 05:50:42.80386082 +0000 UTC m=+1425.891770120" watchObservedRunningTime="2026-03-14 05:50:42.853902874 +0000 UTC m=+1425.941812174"
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.867515 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.91911097 podStartE2EDuration="22.867494187s" podCreationTimestamp="2026-03-14 05:50:20 +0000 UTC" firstStartedPulling="2026-03-14 05:50:34.996001247 +0000 UTC m=+1418.083910547" lastFinishedPulling="2026-03-14 05:50:41.944384464 +0000 UTC m=+1425.032293764" observedRunningTime="2026-03-14 05:50:42.822412291 +0000 UTC m=+1425.910321591" watchObservedRunningTime="2026-03-14 05:50:42.867494187 +0000 UTC m=+1425.955403487"
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.902877 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lk79w" podStartSLOduration=13.744870878 podStartE2EDuration="19.902855993s" podCreationTimestamp="2026-03-14 05:50:23 +0000 UTC" firstStartedPulling="2026-03-14 05:50:35.71268607 +0000 UTC m=+1418.800595380" lastFinishedPulling="2026-03-14 05:50:41.870671195 +0000 UTC m=+1424.958580495" observedRunningTime="2026-03-14 05:50:42.83964798 +0000 UTC m=+1425.927557290" watchObservedRunningTime="2026-03-14 05:50:42.902855993 +0000 UTC m=+1425.990765303"
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.961282 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-999d8b566-8h8bm"]
Mar 14 05:50:42 crc kubenswrapper[4713]: I0314 05:50:42.964007 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.922685454 podStartE2EDuration="25.96398141s" podCreationTimestamp="2026-03-14 05:50:17 +0000 UTC" firstStartedPulling="2026-03-14 05:50:35.825523606 +0000 UTC m=+1418.913432906" lastFinishedPulling="2026-03-14 05:50:41.866819562 +0000 UTC m=+1424.954728862" observedRunningTime="2026-03-14 05:50:42.906612153 +0000 UTC m=+1425.994521453" watchObservedRunningTime="2026-03-14 05:50:42.96398141 +0000 UTC m=+1426.051890710"
Mar 14 05:50:43 crc kubenswrapper[4713]: I0314 05:50:43.778996 4713 generic.go:334] "Generic (PLEG): container finished" podID="1475dc78-ec5d-45b6-a21d-0e6e6320a012" containerID="49604bd66aa9cf1fc9531f0561e4f77d187a96374e559a42a8acd42af149c7be" exitCode=0
Mar 14 05:50:43 crc kubenswrapper[4713]: I0314 05:50:43.779100 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6pzwm" event={"ID":"1475dc78-ec5d-45b6-a21d-0e6e6320a012","Type":"ContainerDied","Data":"49604bd66aa9cf1fc9531f0561e4f77d187a96374e559a42a8acd42af149c7be"}
Mar 14 05:50:44 crc kubenswrapper[4713]: I0314 05:50:44.426335 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:50:44 crc kubenswrapper[4713]: I0314 05:50:44.484849 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-bq2gv"]
Mar 14 05:50:44 crc kubenswrapper[4713]: I0314 05:50:44.793380 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerStarted","Data":"5d86cd1a4eac22c48e3f31d996967f870b5305aea6e524f8a9ae9abe1370021a"}
Mar 14 05:50:44 crc kubenswrapper[4713]: I0314 05:50:44.795611 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6pzwm" event={"ID":"1475dc78-ec5d-45b6-a21d-0e6e6320a012","Type":"ContainerStarted","Data":"91007ee27921026c8cf19f1c5e7d1d157e36603e76741da0f8dd1df769d0e91e"}
Mar 14 05:50:44 crc kubenswrapper[4713]: I0314 05:50:44.795646 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6pzwm" event={"ID":"1475dc78-ec5d-45b6-a21d-0e6e6320a012","Type":"ContainerStarted","Data":"d20bcd50e3ddd0cd4d96a0282235751fdc71b5c41e09c290ace195336c7e27fa"}
Mar 14 05:50:44 crc kubenswrapper[4713]: I0314 05:50:44.796446 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:44 crc kubenswrapper[4713]: I0314 05:50:44.907665 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-6pzwm" podStartSLOduration=16.892531333 podStartE2EDuration="21.907642331s" podCreationTimestamp="2026-03-14 05:50:23 +0000 UTC" firstStartedPulling="2026-03-14 05:50:36.851803047 +0000 UTC m=+1419.939712347" lastFinishedPulling="2026-03-14 05:50:41.866914055 +0000 UTC m=+1424.954823345" observedRunningTime="2026-03-14 05:50:44.900122312 +0000 UTC m=+1427.988031632" watchObservedRunningTime="2026-03-14 05:50:44.907642331 +0000 UTC m=+1427.995551641"
Mar 14 05:50:45 crc kubenswrapper[4713]: I0314 05:50:45.809318 4713 generic.go:334] "Generic (PLEG): container finished" podID="02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b" containerID="3a40b1b5179e261c631568cecc696c7f195703f3d4a5d3db8bb725c738da9be2" exitCode=0
Mar 14 05:50:45 crc kubenswrapper[4713]: I0314 05:50:45.809398 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b","Type":"ContainerDied","Data":"3a40b1b5179e261c631568cecc696c7f195703f3d4a5d3db8bb725c738da9be2"}
Mar 14 05:50:45 crc kubenswrapper[4713]: I0314 05:50:45.810944 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6pzwm"
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.233486 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.260410 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-dns-svc\") pod \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") "
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.260569 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-724zc\" (UniqueName: \"kubernetes.io/projected/58e4241d-f7c9-4c22-9f7a-9391a358eda4-kube-api-access-724zc\") pod \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") "
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.260783 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-config\") pod \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\" (UID: \"58e4241d-f7c9-4c22-9f7a-9391a358eda4\") "
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.261251 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "58e4241d-f7c9-4c22-9f7a-9391a358eda4" (UID: "58e4241d-f7c9-4c22-9f7a-9391a358eda4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.261327 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-config" (OuterVolumeSpecName: "config") pod "58e4241d-f7c9-4c22-9f7a-9391a358eda4" (UID: "58e4241d-f7c9-4c22-9f7a-9391a358eda4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.261982 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-config\") on node \"crc\" DevicePath \"\""
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.262081 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e4241d-f7c9-4c22-9f7a-9391a358eda4-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.269588 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e4241d-f7c9-4c22-9f7a-9391a358eda4-kube-api-access-724zc" (OuterVolumeSpecName: "kube-api-access-724zc") pod "58e4241d-f7c9-4c22-9f7a-9391a358eda4" (UID: "58e4241d-f7c9-4c22-9f7a-9391a358eda4"). InnerVolumeSpecName "kube-api-access-724zc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.365393 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-724zc\" (UniqueName: \"kubernetes.io/projected/58e4241d-f7c9-4c22-9f7a-9391a358eda4-kube-api-access-724zc\") on node \"crc\" DevicePath \"\""
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.539219 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.539265 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.824587 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv" event={"ID":"58e4241d-f7c9-4c22-9f7a-9391a358eda4","Type":"ContainerDied","Data":"f003af921eb31ffc2d3a291a166b2edfa02caddc45c55b3c6a350794a67626b8"}
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.824622 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-bq2gv"
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.947391 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-bq2gv"]
Mar 14 05:50:46 crc kubenswrapper[4713]: I0314 05:50:46.956214 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-bq2gv"]
Mar 14 05:50:47 crc kubenswrapper[4713]: I0314 05:50:47.594365 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e4241d-f7c9-4c22-9f7a-9391a358eda4" path="/var/lib/kubelet/pods/58e4241d-f7c9-4c22-9f7a-9391a358eda4/volumes"
Mar 14 05:50:47 crc kubenswrapper[4713]: I0314 05:50:47.837299 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7","Type":"ContainerStarted","Data":"872a07de39d0141da6906311721a8fc993a8278d9e622605ce63a885d997f62a"}
Mar 14 05:50:47 crc kubenswrapper[4713]: I0314 05:50:47.842687 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b","Type":"ContainerStarted","Data":"59e02f29178cf81ce4027336a1d45bd616a98cedaf79c7045e533c5f58de2e9f"}
Mar 14 05:50:47 crc kubenswrapper[4713]: I0314 05:50:47.846038 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d30edab6-1aa0-47d8-a20d-d2d2d0d6185d","Type":"ContainerStarted","Data":"1956f125a35109c06b6ef3ebd2421a139aff7ee5981977e8b856a1edf464b940"}
Mar 14 05:50:47 crc kubenswrapper[4713]: I0314 05:50:47.865328 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.285223931 podStartE2EDuration="22.865309021s" podCreationTimestamp="2026-03-14 05:50:25 +0000 UTC" firstStartedPulling="2026-03-14 05:50:36.31603495 +0000 UTC m=+1419.403944260" lastFinishedPulling="2026-03-14 05:50:46.89612005 +0000 UTC m=+1429.984029350" observedRunningTime="2026-03-14 05:50:47.856927534 +0000 UTC m=+1430.944836834" watchObservedRunningTime="2026-03-14 05:50:47.865309021 +0000 UTC m=+1430.953218321"
Mar 14 05:50:47 crc kubenswrapper[4713]: I0314 05:50:47.891835 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=31.891813295 podStartE2EDuration="31.891813295s" podCreationTimestamp="2026-03-14 05:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:50:47.884696889 +0000 UTC m=+1430.972606199" watchObservedRunningTime="2026-03-14 05:50:47.891813295 +0000 UTC m=+1430.979722595"
Mar 14 05:50:47 crc kubenswrapper[4713]: I0314 05:50:47.917671 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.950073451 podStartE2EDuration="24.917621817s" podCreationTimestamp="2026-03-14 05:50:23 +0000 UTC" firstStartedPulling="2026-03-14 05:50:36.929119551 +0000 UTC m=+1420.017028851" lastFinishedPulling="2026-03-14 05:50:46.896667917 +0000 UTC m=+1429.984577217" observedRunningTime="2026-03-14 05:50:47.910566993 +0000 UTC m=+1430.998476303" watchObservedRunningTime="2026-03-14 05:50:47.917621817 +0000 UTC m=+1431.005531117"
Mar 14 05:50:48 crc kubenswrapper[4713]: I0314 05:50:48.162077 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 14 05:50:48 crc kubenswrapper[4713]: I0314 05:50:48.162151 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 14 05:50:48 crc kubenswrapper[4713]: I0314 05:50:48.233685 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 14 05:50:48 crc kubenswrapper[4713]: I0314 05:50:48.362483 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:48 crc kubenswrapper[4713]: I0314 05:50:48.416124 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:48 crc kubenswrapper[4713]: I0314 05:50:48.847590 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:48 crc kubenswrapper[4713]: I0314 05:50:48.856807 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:48 crc kubenswrapper[4713]: I0314 05:50:48.886009 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:48 crc kubenswrapper[4713]: I0314 05:50:48.894565 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.125587 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.207539 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-xfkcd"]
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.210296 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.218625 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.241823 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-xfkcd"]
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.314658 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-shmpz"]
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.316916 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.324864 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.337266 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.337390 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-config\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.337415 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.337447 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgpl7\" (UniqueName: \"kubernetes.io/projected/79d05c7e-d928-420d-a903-5f6d40a34631-kube-api-access-tgpl7\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.414770 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-shmpz"]
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.433091 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.442511 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/165fd5f2-33fe-4736-8787-75331305fc9b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.442561 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.442922 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cc6g\" (UniqueName: \"kubernetes.io/projected/165fd5f2-33fe-4736-8787-75331305fc9b-kube-api-access-9cc6g\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.442953 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-config\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.442975 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.443008 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165fd5f2-33fe-4736-8787-75331305fc9b-combined-ca-bundle\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.443037 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgpl7\" (UniqueName: \"kubernetes.io/projected/79d05c7e-d928-420d-a903-5f6d40a34631-kube-api-access-tgpl7\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.443101 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165fd5f2-33fe-4736-8787-75331305fc9b-config\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.443124 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/165fd5f2-33fe-4736-8787-75331305fc9b-ovs-rundir\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.443150 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/165fd5f2-33fe-4736-8787-75331305fc9b-ovn-rundir\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.443961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.444527 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-config\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.445063 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.480686 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgpl7\" (UniqueName: \"kubernetes.io/projected/79d05c7e-d928-420d-a903-5f6d40a34631-kube-api-access-tgpl7\") pod \"dnsmasq-dns-6bc7876d45-xfkcd\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.543779 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.545373 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cc6g\" (UniqueName: \"kubernetes.io/projected/165fd5f2-33fe-4736-8787-75331305fc9b-kube-api-access-9cc6g\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.545426 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165fd5f2-33fe-4736-8787-75331305fc9b-combined-ca-bundle\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.545513 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165fd5f2-33fe-4736-8787-75331305fc9b-config\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.545539 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/165fd5f2-33fe-4736-8787-75331305fc9b-ovs-rundir\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.545567 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/165fd5f2-33fe-4736-8787-75331305fc9b-ovn-rundir\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.545595 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/165fd5f2-33fe-4736-8787-75331305fc9b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.546454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/165fd5f2-33fe-4736-8787-75331305fc9b-ovs-rundir\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.546656 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165fd5f2-33fe-4736-8787-75331305fc9b-config\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.546845 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/165fd5f2-33fe-4736-8787-75331305fc9b-ovn-rundir\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.551128 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/165fd5f2-33fe-4736-8787-75331305fc9b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.554298 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165fd5f2-33fe-4736-8787-75331305fc9b-combined-ca-bundle\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.607910 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cc6g\" (UniqueName: \"kubernetes.io/projected/165fd5f2-33fe-4736-8787-75331305fc9b-kube-api-access-9cc6g\") pod \"ovn-controller-metrics-shmpz\" (UID: \"165fd5f2-33fe-4736-8787-75331305fc9b\") " pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.624991 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-xfkcd"]
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.656128 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-shmpz"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.672181 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-8fkzt"]
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.674981 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.680157 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.697569 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8fkzt"]
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.754682 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.754747 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.754790 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rhw5\" (UniqueName: \"kubernetes.io/projected/e586b4a9-3d6b-476a-a83a-835aae29e7cc-kube-api-access-5rhw5\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.754922 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-dns-svc\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.755067 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-config\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.848011 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.857901 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-config\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.857989 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.858051 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.858105 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rhw5\" (UniqueName: \"kubernetes.io/projected/e586b4a9-3d6b-476a-a83a-835aae29e7cc-kube-api-access-5rhw5\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.858198 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-dns-svc\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.859369 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-dns-svc\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.861435 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.861750 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-config\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt"
Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.862048 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID:
\"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt" Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.881027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rhw5\" (UniqueName: \"kubernetes.io/projected/e586b4a9-3d6b-476a-a83a-835aae29e7cc-kube-api-access-5rhw5\") pod \"dnsmasq-dns-8554648995-8fkzt\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " pod="openstack/dnsmasq-dns-8554648995-8fkzt" Mar 14 05:50:49 crc kubenswrapper[4713]: I0314 05:50:49.901011 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.082999 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8fkzt" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.128436 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.138281 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.144684 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.144881 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.145038 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.145293 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-tgb76" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.160961 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.166333 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca274e3b-b1c1-4083-8a05-7b9a536fe088-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.166424 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca274e3b-b1c1-4083-8a05-7b9a536fe088-config\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.166445 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca274e3b-b1c1-4083-8a05-7b9a536fe088-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " 
pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.166632 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca274e3b-b1c1-4083-8a05-7b9a536fe088-scripts\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.166675 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpk9d\" (UniqueName: \"kubernetes.io/projected/ca274e3b-b1c1-4083-8a05-7b9a536fe088-kube-api-access-jpk9d\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.166718 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ca274e3b-b1c1-4083-8a05-7b9a536fe088-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.166807 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca274e3b-b1c1-4083-8a05-7b9a536fe088-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.175589 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-xfkcd"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.270870 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca274e3b-b1c1-4083-8a05-7b9a536fe088-scripts\") pod \"ovn-northd-0\" (UID: 
\"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.269796 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca274e3b-b1c1-4083-8a05-7b9a536fe088-scripts\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.289387 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpk9d\" (UniqueName: \"kubernetes.io/projected/ca274e3b-b1c1-4083-8a05-7b9a536fe088-kube-api-access-jpk9d\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.289442 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ca274e3b-b1c1-4083-8a05-7b9a536fe088-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.289571 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca274e3b-b1c1-4083-8a05-7b9a536fe088-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.289828 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca274e3b-b1c1-4083-8a05-7b9a536fe088-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.289876 4713 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca274e3b-b1c1-4083-8a05-7b9a536fe088-config\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.289907 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca274e3b-b1c1-4083-8a05-7b9a536fe088-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.291719 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ca274e3b-b1c1-4083-8a05-7b9a536fe088-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.292189 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca274e3b-b1c1-4083-8a05-7b9a536fe088-config\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.298519 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca274e3b-b1c1-4083-8a05-7b9a536fe088-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.300334 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca274e3b-b1c1-4083-8a05-7b9a536fe088-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc 
kubenswrapper[4713]: I0314 05:50:50.301801 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca274e3b-b1c1-4083-8a05-7b9a536fe088-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.303542 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-shmpz"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.313341 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpk9d\" (UniqueName: \"kubernetes.io/projected/ca274e3b-b1c1-4083-8a05-7b9a536fe088-kube-api-access-jpk9d\") pod \"ovn-northd-0\" (UID: \"ca274e3b-b1c1-4083-8a05-7b9a536fe088\") " pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: W0314 05:50:50.337196 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod165fd5f2_33fe_4736_8787_75331305fc9b.slice/crio-0d6f8bc39ee83869e85e58416588c29ebefc24c51dd68185b3f56a7ec4647468 WatchSource:0}: Error finding container 0d6f8bc39ee83869e85e58416588c29ebefc24c51dd68185b3f56a7ec4647468: Status 404 returned error can't find the container with id 0d6f8bc39ee83869e85e58416588c29ebefc24c51dd68185b3f56a7ec4647468 Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.487887 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.606592 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8fkzt"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.638788 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.659278 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c7l2k"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.660787 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.706619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r8nk\" (UniqueName: \"kubernetes.io/projected/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-kube-api-access-4r8nk\") pod \"mysqld-exporter-openstack-db-create-c7l2k\" (UID: \"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\") " pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.707033 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-c7l2k\" (UID: \"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\") " pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.715808 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-24qbb"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.717377 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.727107 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c7l2k"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.742288 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-24qbb"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.809765 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.809826 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glvnq\" (UniqueName: \"kubernetes.io/projected/9fd5a7aa-b08a-47d4-a730-73b10501f049-kube-api-access-glvnq\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.809879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r8nk\" (UniqueName: \"kubernetes.io/projected/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-kube-api-access-4r8nk\") pod \"mysqld-exporter-openstack-db-create-c7l2k\" (UID: \"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\") " pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.809923 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-c7l2k\" (UID: 
\"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\") " pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.810055 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-config\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.810100 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.810127 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.811451 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-c7l2k\" (UID: \"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\") " pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.917747 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-config\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: 
\"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.917809 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.917834 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.917887 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.917920 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glvnq\" (UniqueName: \"kubernetes.io/projected/9fd5a7aa-b08a-47d4-a730-73b10501f049-kube-api-access-glvnq\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.919406 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-config\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" 
Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.919949 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.920078 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.920658 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.944227 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8fkzt"] Mar 14 05:50:50 crc kubenswrapper[4713]: I0314 05:50:50.954467 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-shmpz" event={"ID":"165fd5f2-33fe-4736-8787-75331305fc9b","Type":"ContainerStarted","Data":"0d6f8bc39ee83869e85e58416588c29ebefc24c51dd68185b3f56a7ec4647468"} Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.035783 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd" podUID="79d05c7e-d928-420d-a903-5f6d40a34631" containerName="init" containerID="cri-o://fa537248ef2e74fbe75cec258207320b4214f16415b8a3257faa2bbca12b6a72" gracePeriod=10 Mar 14 05:50:51 crc kubenswrapper[4713]: 
I0314 05:50:51.040539 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd" event={"ID":"79d05c7e-d928-420d-a903-5f6d40a34631","Type":"ContainerStarted","Data":"fa537248ef2e74fbe75cec258207320b4214f16415b8a3257faa2bbca12b6a72"} Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.047540 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd" event={"ID":"79d05c7e-d928-420d-a903-5f6d40a34631","Type":"ContainerStarted","Data":"04802d8918f8cd8e0b52a84f577b587bec6dd9f44780e2a7a19fa96f15a0a6c7"} Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.083026 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r8nk\" (UniqueName: \"kubernetes.io/projected/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-kube-api-access-4r8nk\") pod \"mysqld-exporter-openstack-db-create-c7l2k\" (UID: \"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\") " pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.095971 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glvnq\" (UniqueName: \"kubernetes.io/projected/9fd5a7aa-b08a-47d4-a730-73b10501f049-kube-api-access-glvnq\") pod \"dnsmasq-dns-b8fbc5445-24qbb\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.226044 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-a0d1-account-create-update-nzqcb"] Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.242729 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.249313 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.258008 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-a0d1-account-create-update-nzqcb"] Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.353544 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v929f\" (UniqueName: \"kubernetes.io/projected/57dc5163-22bf-4b59-a601-2ca96749dead-kube-api-access-v929f\") pod \"mysqld-exporter-a0d1-account-create-update-nzqcb\" (UID: \"57dc5163-22bf-4b59-a601-2ca96749dead\") " pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.354230 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dc5163-22bf-4b59-a601-2ca96749dead-operator-scripts\") pod \"mysqld-exporter-a0d1-account-create-update-nzqcb\" (UID: \"57dc5163-22bf-4b59-a601-2ca96749dead\") " pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.366934 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.394968 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.456528 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v929f\" (UniqueName: \"kubernetes.io/projected/57dc5163-22bf-4b59-a601-2ca96749dead-kube-api-access-v929f\") pod \"mysqld-exporter-a0d1-account-create-update-nzqcb\" (UID: \"57dc5163-22bf-4b59-a601-2ca96749dead\") " pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.456635 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dc5163-22bf-4b59-a601-2ca96749dead-operator-scripts\") pod \"mysqld-exporter-a0d1-account-create-update-nzqcb\" (UID: \"57dc5163-22bf-4b59-a601-2ca96749dead\") " pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.457490 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dc5163-22bf-4b59-a601-2ca96749dead-operator-scripts\") pod \"mysqld-exporter-a0d1-account-create-update-nzqcb\" (UID: \"57dc5163-22bf-4b59-a601-2ca96749dead\") " pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.486241 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v929f\" (UniqueName: \"kubernetes.io/projected/57dc5163-22bf-4b59-a601-2ca96749dead-kube-api-access-v929f\") pod \"mysqld-exporter-a0d1-account-create-update-nzqcb\" (UID: \"57dc5163-22bf-4b59-a601-2ca96749dead\") " pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.579437 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.622978 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.844611 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.867180 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.867326 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.870487 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.870911 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.871394 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.873175 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-mv8rf" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.974940 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eb5f1653-3d8a-404c-842a-792462b53c54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb5f1653-3d8a-404c-842a-792462b53c54\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.975042 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80f03c3b-d224-4e9d-8e52-e0376b3f215f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.975068 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbw2f\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-kube-api-access-pbw2f\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.975169 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.975189 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/80f03c3b-d224-4e9d-8e52-e0376b3f215f-lock\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:51 crc kubenswrapper[4713]: I0314 05:50:51.975237 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/80f03c3b-d224-4e9d-8e52-e0376b3f215f-cache\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.014696 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c7l2k"] Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.044699 4713 generic.go:334] "Generic (PLEG): container finished" 
podID="79d05c7e-d928-420d-a903-5f6d40a34631" containerID="fa537248ef2e74fbe75cec258207320b4214f16415b8a3257faa2bbca12b6a72" exitCode=0 Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.044819 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd" event={"ID":"79d05c7e-d928-420d-a903-5f6d40a34631","Type":"ContainerDied","Data":"fa537248ef2e74fbe75cec258207320b4214f16415b8a3257faa2bbca12b6a72"} Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.046932 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" event={"ID":"d3603f0e-10ee-4007-a5b0-83dfacefa9d4","Type":"ContainerStarted","Data":"665b60c0f41bb7db3019ce9ce83fea241f9360073206b9a0c07778b15b239903"} Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.048103 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ca274e3b-b1c1-4083-8a05-7b9a536fe088","Type":"ContainerStarted","Data":"ed1db4f2a2aaa05c8ff9a13bcbf68331143c213879ae8b8183d832bab80c861e"} Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.050292 4713 generic.go:334] "Generic (PLEG): container finished" podID="e586b4a9-3d6b-476a-a83a-835aae29e7cc" containerID="daf34abaf83627c1e48e3c64420e3531a6f12ff897e947dbe72f138ac2ef0cf2" exitCode=0 Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.050370 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8fkzt" event={"ID":"e586b4a9-3d6b-476a-a83a-835aae29e7cc","Type":"ContainerDied","Data":"daf34abaf83627c1e48e3c64420e3531a6f12ff897e947dbe72f138ac2ef0cf2"} Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.050391 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8fkzt" event={"ID":"e586b4a9-3d6b-476a-a83a-835aae29e7cc","Type":"ContainerStarted","Data":"a610d07f61a55979f577ca726ffeea1034a9558c560a6dfe48d5ec996981ec91"} Mar 14 05:50:52 crc kubenswrapper[4713]: 
I0314 05:50:52.052136 4713 generic.go:334] "Generic (PLEG): container finished" podID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerID="5d86cd1a4eac22c48e3f31d996967f870b5305aea6e524f8a9ae9abe1370021a" exitCode=0 Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.052217 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerDied","Data":"5d86cd1a4eac22c48e3f31d996967f870b5305aea6e524f8a9ae9abe1370021a"} Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.054427 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-shmpz" event={"ID":"165fd5f2-33fe-4736-8787-75331305fc9b","Type":"ContainerStarted","Data":"0e9fb1f92895d007b24e8044389a289797da080f046c6013a4f996ce8d1e4179"} Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.085319 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.085377 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/80f03c3b-d224-4e9d-8e52-e0376b3f215f-lock\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.085432 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/80f03c3b-d224-4e9d-8e52-e0376b3f215f-cache\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.085496 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-eb5f1653-3d8a-404c-842a-792462b53c54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb5f1653-3d8a-404c-842a-792462b53c54\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.085594 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f03c3b-d224-4e9d-8e52-e0376b3f215f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.085626 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbw2f\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-kube-api-access-pbw2f\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: E0314 05:50:52.086112 4713 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 05:50:52 crc kubenswrapper[4713]: E0314 05:50:52.086130 4713 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 05:50:52 crc kubenswrapper[4713]: E0314 05:50:52.086168 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift podName:80f03c3b-d224-4e9d-8e52-e0376b3f215f nodeName:}" failed. No retries permitted until 2026-03-14 05:50:52.586152446 +0000 UTC m=+1435.674061746 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift") pod "swift-storage-0" (UID: "80f03c3b-d224-4e9d-8e52-e0376b3f215f") : configmap "swift-ring-files" not found Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.086680 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/80f03c3b-d224-4e9d-8e52-e0376b3f215f-lock\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.088635 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/80f03c3b-d224-4e9d-8e52-e0376b3f215f-cache\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.090658 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.090691 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eb5f1653-3d8a-404c-842a-792462b53c54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb5f1653-3d8a-404c-842a-792462b53c54\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9fc57282380f5a002bc7f921686157b385ce5630faba202bf93246a73afbd18a/globalmount\"" pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.100220 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f03c3b-d224-4e9d-8e52-e0376b3f215f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.129739 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbw2f\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-kube-api-access-pbw2f\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.162954 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-shmpz" podStartSLOduration=3.162930342 podStartE2EDuration="3.162930342s" podCreationTimestamp="2026-03-14 05:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:50:52.142739628 +0000 UTC m=+1435.230648928" watchObservedRunningTime="2026-03-14 05:50:52.162930342 +0000 UTC m=+1435.250839642" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.176429 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-eb5f1653-3d8a-404c-842a-792462b53c54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb5f1653-3d8a-404c-842a-792462b53c54\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.233613 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-24qbb"] Mar 14 05:50:52 crc kubenswrapper[4713]: W0314 05:50:52.243287 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fd5a7aa_b08a_47d4_a730_73b10501f049.slice/crio-0e43763ac06eebf6279fe4e92d3271027125b0c081df2a3297a3dbdef6835dc7 WatchSource:0}: Error finding container 0e43763ac06eebf6279fe4e92d3271027125b0c081df2a3297a3dbdef6835dc7: Status 404 returned error can't find the container with id 0e43763ac06eebf6279fe4e92d3271027125b0c081df2a3297a3dbdef6835dc7 Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.390360 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-a0d1-account-create-update-nzqcb"] Mar 14 05:50:52 crc kubenswrapper[4713]: W0314 05:50:52.417390 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57dc5163_22bf_4b59_a601_2ca96749dead.slice/crio-64c0ed162aaef894fd7d123506eeca05d08857e641920a3bdea845f4c61e10fe WatchSource:0}: Error finding container 64c0ed162aaef894fd7d123506eeca05d08857e641920a3bdea845f4c61e10fe: Status 404 returned error can't find the container with id 64c0ed162aaef894fd7d123506eeca05d08857e641920a3bdea845f4c61e10fe Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.607696 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift\") pod \"swift-storage-0\" (UID: 
\"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:52 crc kubenswrapper[4713]: E0314 05:50:52.607824 4713 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 05:50:52 crc kubenswrapper[4713]: E0314 05:50:52.607849 4713 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 05:50:52 crc kubenswrapper[4713]: E0314 05:50:52.607906 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift podName:80f03c3b-d224-4e9d-8e52-e0376b3f215f nodeName:}" failed. No retries permitted until 2026-03-14 05:50:53.607887975 +0000 UTC m=+1436.695797275 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift") pod "swift-storage-0" (UID: "80f03c3b-d224-4e9d-8e52-e0376b3f215f") : configmap "swift-ring-files" not found Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.720253 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8fkzt" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.810916 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rhw5\" (UniqueName: \"kubernetes.io/projected/e586b4a9-3d6b-476a-a83a-835aae29e7cc-kube-api-access-5rhw5\") pod \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.811316 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-sb\") pod \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.811368 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-config\") pod \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.811418 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-dns-svc\") pod \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.811732 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-nb\") pod \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\" (UID: \"e586b4a9-3d6b-476a-a83a-835aae29e7cc\") " Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.818768 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e586b4a9-3d6b-476a-a83a-835aae29e7cc-kube-api-access-5rhw5" (OuterVolumeSpecName: "kube-api-access-5rhw5") pod "e586b4a9-3d6b-476a-a83a-835aae29e7cc" (UID: "e586b4a9-3d6b-476a-a83a-835aae29e7cc"). InnerVolumeSpecName "kube-api-access-5rhw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.843573 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e586b4a9-3d6b-476a-a83a-835aae29e7cc" (UID: "e586b4a9-3d6b-476a-a83a-835aae29e7cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.855710 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e586b4a9-3d6b-476a-a83a-835aae29e7cc" (UID: "e586b4a9-3d6b-476a-a83a-835aae29e7cc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.863335 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-config" (OuterVolumeSpecName: "config") pod "e586b4a9-3d6b-476a-a83a-835aae29e7cc" (UID: "e586b4a9-3d6b-476a-a83a-835aae29e7cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.872035 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e586b4a9-3d6b-476a-a83a-835aae29e7cc" (UID: "e586b4a9-3d6b-476a-a83a-835aae29e7cc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.915744 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rhw5\" (UniqueName: \"kubernetes.io/projected/e586b4a9-3d6b-476a-a83a-835aae29e7cc-kube-api-access-5rhw5\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.915782 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.915795 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.915806 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:52 crc kubenswrapper[4713]: I0314 05:50:52.915818 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e586b4a9-3d6b-476a-a83a-835aae29e7cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.124650 4713 generic.go:334] "Generic (PLEG): container finished" podID="d3603f0e-10ee-4007-a5b0-83dfacefa9d4" containerID="77d472d7bd1b0c1c2a37b871ae49f45d40c3269946e317cda0e4bb46ee6f5e5a" exitCode=0 Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.125281 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" event={"ID":"d3603f0e-10ee-4007-a5b0-83dfacefa9d4","Type":"ContainerDied","Data":"77d472d7bd1b0c1c2a37b871ae49f45d40c3269946e317cda0e4bb46ee6f5e5a"} Mar 14 05:50:53 crc 
kubenswrapper[4713]: I0314 05:50:53.132538 4713 generic.go:334] "Generic (PLEG): container finished" podID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerID="ea46a39ac827800f8c57354eff9702ffd1f91e95da7c6bb89c50b12c73638fed" exitCode=0 Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.132614 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" event={"ID":"9fd5a7aa-b08a-47d4-a730-73b10501f049","Type":"ContainerDied","Data":"ea46a39ac827800f8c57354eff9702ffd1f91e95da7c6bb89c50b12c73638fed"} Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.132646 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" event={"ID":"9fd5a7aa-b08a-47d4-a730-73b10501f049","Type":"ContainerStarted","Data":"0e43763ac06eebf6279fe4e92d3271027125b0c081df2a3297a3dbdef6835dc7"} Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.139065 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" event={"ID":"57dc5163-22bf-4b59-a601-2ca96749dead","Type":"ContainerStarted","Data":"a21e67a562cec14af7da9e09edb66b2cc3e7548a15c59e814c467a77281edd21"} Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.139115 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" event={"ID":"57dc5163-22bf-4b59-a601-2ca96749dead","Type":"ContainerStarted","Data":"64c0ed162aaef894fd7d123506eeca05d08857e641920a3bdea845f4c61e10fe"} Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.141799 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8fkzt" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.142093 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8fkzt" event={"ID":"e586b4a9-3d6b-476a-a83a-835aae29e7cc","Type":"ContainerDied","Data":"a610d07f61a55979f577ca726ffeea1034a9558c560a6dfe48d5ec996981ec91"} Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.142129 4713 scope.go:117] "RemoveContainer" containerID="daf34abaf83627c1e48e3c64420e3531a6f12ff897e947dbe72f138ac2ef0cf2" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.210463 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" podStartSLOduration=3.210440597 podStartE2EDuration="3.210440597s" podCreationTimestamp="2026-03-14 05:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:50:53.191966209 +0000 UTC m=+1436.279875519" watchObservedRunningTime="2026-03-14 05:50:53.210440597 +0000 UTC m=+1436.298349897" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.282055 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8fkzt"] Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.293220 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.307898 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8fkzt"] Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.430791 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-config\") pod \"79d05c7e-d928-420d-a903-5f6d40a34631\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.430938 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-dns-svc\") pod \"79d05c7e-d928-420d-a903-5f6d40a34631\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.431084 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-ovsdbserver-sb\") pod \"79d05c7e-d928-420d-a903-5f6d40a34631\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.431334 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgpl7\" (UniqueName: \"kubernetes.io/projected/79d05c7e-d928-420d-a903-5f6d40a34631-kube-api-access-tgpl7\") pod \"79d05c7e-d928-420d-a903-5f6d40a34631\" (UID: \"79d05c7e-d928-420d-a903-5f6d40a34631\") " Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.437978 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79d05c7e-d928-420d-a903-5f6d40a34631-kube-api-access-tgpl7" (OuterVolumeSpecName: "kube-api-access-tgpl7") pod "79d05c7e-d928-420d-a903-5f6d40a34631" (UID: "79d05c7e-d928-420d-a903-5f6d40a34631"). 
InnerVolumeSpecName "kube-api-access-tgpl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.458583 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79d05c7e-d928-420d-a903-5f6d40a34631" (UID: "79d05c7e-d928-420d-a903-5f6d40a34631"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.461440 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79d05c7e-d928-420d-a903-5f6d40a34631" (UID: "79d05c7e-d928-420d-a903-5f6d40a34631"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.464526 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-config" (OuterVolumeSpecName: "config") pod "79d05c7e-d928-420d-a903-5f6d40a34631" (UID: "79d05c7e-d928-420d-a903-5f6d40a34631"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.533994 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.534322 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.534333 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgpl7\" (UniqueName: \"kubernetes.io/projected/79d05c7e-d928-420d-a903-5f6d40a34631-kube-api-access-tgpl7\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.534341 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d05c7e-d928-420d-a903-5f6d40a34631-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.579582 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e586b4a9-3d6b-476a-a83a-835aae29e7cc" path="/var/lib/kubelet/pods/e586b4a9-3d6b-476a-a83a-835aae29e7cc/volumes" Mar 14 05:50:53 crc kubenswrapper[4713]: I0314 05:50:53.635687 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:53 crc kubenswrapper[4713]: E0314 05:50:53.635872 4713 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 05:50:53 crc kubenswrapper[4713]: E0314 05:50:53.635891 4713 projected.go:194] Error preparing data 
for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 05:50:53 crc kubenswrapper[4713]: E0314 05:50:53.635954 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift podName:80f03c3b-d224-4e9d-8e52-e0376b3f215f nodeName:}" failed. No retries permitted until 2026-03-14 05:50:55.63593769 +0000 UTC m=+1438.723846990 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift") pod "swift-storage-0" (UID: "80f03c3b-d224-4e9d-8e52-e0376b3f215f") : configmap "swift-ring-files" not found Mar 14 05:50:54 crc kubenswrapper[4713]: I0314 05:50:54.151128 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" event={"ID":"9fd5a7aa-b08a-47d4-a730-73b10501f049","Type":"ContainerStarted","Data":"a5232079bc8b459c680cc65ce29e9e95c4d5a9cffd0f85b22f96c339bf438e63"} Mar 14 05:50:54 crc kubenswrapper[4713]: I0314 05:50:54.151656 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:50:54 crc kubenswrapper[4713]: I0314 05:50:54.157277 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd" event={"ID":"79d05c7e-d928-420d-a903-5f6d40a34631","Type":"ContainerDied","Data":"04802d8918f8cd8e0b52a84f577b587bec6dd9f44780e2a7a19fa96f15a0a6c7"} Mar 14 05:50:54 crc kubenswrapper[4713]: I0314 05:50:54.157343 4713 scope.go:117] "RemoveContainer" containerID="fa537248ef2e74fbe75cec258207320b4214f16415b8a3257faa2bbca12b6a72" Mar 14 05:50:54 crc kubenswrapper[4713]: I0314 05:50:54.157441 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-xfkcd" Mar 14 05:50:54 crc kubenswrapper[4713]: I0314 05:50:54.172262 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" podStartSLOduration=4.172240193 podStartE2EDuration="4.172240193s" podCreationTimestamp="2026-03-14 05:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:50:54.16961285 +0000 UTC m=+1437.257522160" watchObservedRunningTime="2026-03-14 05:50:54.172240193 +0000 UTC m=+1437.260149493" Mar 14 05:50:54 crc kubenswrapper[4713]: I0314 05:50:54.236623 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-xfkcd"] Mar 14 05:50:54 crc kubenswrapper[4713]: I0314 05:50:54.248583 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-xfkcd"] Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.145227 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-x8vvq"] Mar 14 05:50:55 crc kubenswrapper[4713]: E0314 05:50:55.145844 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e586b4a9-3d6b-476a-a83a-835aae29e7cc" containerName="init" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.145947 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e586b4a9-3d6b-476a-a83a-835aae29e7cc" containerName="init" Mar 14 05:50:55 crc kubenswrapper[4713]: E0314 05:50:55.146017 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d05c7e-d928-420d-a903-5f6d40a34631" containerName="init" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.146067 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d05c7e-d928-420d-a903-5f6d40a34631" containerName="init" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.146437 4713 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e586b4a9-3d6b-476a-a83a-835aae29e7cc" containerName="init" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.146607 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="79d05c7e-d928-420d-a903-5f6d40a34631" containerName="init" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.147330 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x8vvq" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.149612 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.179627 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x8vvq"] Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.268984 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ghw4\" (UniqueName: \"kubernetes.io/projected/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-kube-api-access-8ghw4\") pod \"root-account-create-update-x8vvq\" (UID: \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\") " pod="openstack/root-account-create-update-x8vvq" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.269719 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-operator-scripts\") pod \"root-account-create-update-x8vvq\" (UID: \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\") " pod="openstack/root-account-create-update-x8vvq" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.372354 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-operator-scripts\") pod \"root-account-create-update-x8vvq\" (UID: \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\") 
" pod="openstack/root-account-create-update-x8vvq" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.372455 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ghw4\" (UniqueName: \"kubernetes.io/projected/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-kube-api-access-8ghw4\") pod \"root-account-create-update-x8vvq\" (UID: \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\") " pod="openstack/root-account-create-update-x8vvq" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.373189 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-operator-scripts\") pod \"root-account-create-update-x8vvq\" (UID: \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\") " pod="openstack/root-account-create-update-x8vvq" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.391798 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ghw4\" (UniqueName: \"kubernetes.io/projected/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-kube-api-access-8ghw4\") pod \"root-account-create-update-x8vvq\" (UID: \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\") " pod="openstack/root-account-create-update-x8vvq" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.470896 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x8vvq" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.581843 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79d05c7e-d928-420d-a903-5f6d40a34631" path="/var/lib/kubelet/pods/79d05c7e-d928-420d-a903-5f6d40a34631/volumes" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.638998 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qc8rx"] Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.640601 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.644103 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.644333 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.649446 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.652573 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qc8rx"] Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.678347 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1341c453-d963-4a43-a264-0f94dd02b7dd-etc-swift\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.678904 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-ring-data-devices\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.679136 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-dispersionconf\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 
05:50:55.679177 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:50:55 crc kubenswrapper[4713]: E0314 05:50:55.679591 4713 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 05:50:55 crc kubenswrapper[4713]: E0314 05:50:55.679645 4713 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 05:50:55 crc kubenswrapper[4713]: E0314 05:50:55.679725 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift podName:80f03c3b-d224-4e9d-8e52-e0376b3f215f nodeName:}" failed. No retries permitted until 2026-03-14 05:50:59.679700959 +0000 UTC m=+1442.767610339 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift") pod "swift-storage-0" (UID: "80f03c3b-d224-4e9d-8e52-e0376b3f215f") : configmap "swift-ring-files" not found Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.680299 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb9r4\" (UniqueName: \"kubernetes.io/projected/1341c453-d963-4a43-a264-0f94dd02b7dd-kube-api-access-fb9r4\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.680417 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-combined-ca-bundle\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.680468 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-scripts\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.680534 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-swiftconf\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.782012 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1341c453-d963-4a43-a264-0f94dd02b7dd-etc-swift\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.782078 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-ring-data-devices\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.782103 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-dispersionconf\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.782183 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb9r4\" (UniqueName: \"kubernetes.io/projected/1341c453-d963-4a43-a264-0f94dd02b7dd-kube-api-access-fb9r4\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.782246 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-combined-ca-bundle\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.782276 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-scripts\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.782316 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-swiftconf\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.783240 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1341c453-d963-4a43-a264-0f94dd02b7dd-etc-swift\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.783293 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-ring-data-devices\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.783417 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-scripts\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.785455 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-swiftconf\") pod \"swift-ring-rebalance-qc8rx\" (UID: 
\"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.785549 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-dispersionconf\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.787296 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-combined-ca-bundle\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.801562 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb9r4\" (UniqueName: \"kubernetes.io/projected/1341c453-d963-4a43-a264-0f94dd02b7dd-kube-api-access-fb9r4\") pod \"swift-ring-rebalance-qc8rx\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:55 crc kubenswrapper[4713]: I0314 05:50:55.974339 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:50:56 crc kubenswrapper[4713]: I0314 05:50:56.187342 4713 generic.go:334] "Generic (PLEG): container finished" podID="57dc5163-22bf-4b59-a601-2ca96749dead" containerID="a21e67a562cec14af7da9e09edb66b2cc3e7548a15c59e814c467a77281edd21" exitCode=0 Mar 14 05:50:56 crc kubenswrapper[4713]: I0314 05:50:56.187398 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" event={"ID":"57dc5163-22bf-4b59-a601-2ca96749dead","Type":"ContainerDied","Data":"a21e67a562cec14af7da9e09edb66b2cc3e7548a15c59e814c467a77281edd21"} Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.245969 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.247108 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" event={"ID":"57dc5163-22bf-4b59-a601-2ca96749dead","Type":"ContainerDied","Data":"64c0ed162aaef894fd7d123506eeca05d08857e641920a3bdea845f4c61e10fe"} Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.247133 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64c0ed162aaef894fd7d123506eeca05d08857e641920a3bdea845f4c61e10fe" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.249719 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.344179 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v929f\" (UniqueName: \"kubernetes.io/projected/57dc5163-22bf-4b59-a601-2ca96749dead-kube-api-access-v929f\") pod \"57dc5163-22bf-4b59-a601-2ca96749dead\" (UID: \"57dc5163-22bf-4b59-a601-2ca96749dead\") " Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.344234 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dc5163-22bf-4b59-a601-2ca96749dead-operator-scripts\") pod \"57dc5163-22bf-4b59-a601-2ca96749dead\" (UID: \"57dc5163-22bf-4b59-a601-2ca96749dead\") " Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.344349 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-operator-scripts\") pod \"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\" (UID: \"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\") " Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.344438 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r8nk\" (UniqueName: \"kubernetes.io/projected/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-kube-api-access-4r8nk\") pod \"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\" (UID: \"d3603f0e-10ee-4007-a5b0-83dfacefa9d4\") " Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.347980 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57dc5163-22bf-4b59-a601-2ca96749dead-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57dc5163-22bf-4b59-a601-2ca96749dead" (UID: "57dc5163-22bf-4b59-a601-2ca96749dead"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.357921 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3603f0e-10ee-4007-a5b0-83dfacefa9d4" (UID: "d3603f0e-10ee-4007-a5b0-83dfacefa9d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.360711 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-kube-api-access-4r8nk" (OuterVolumeSpecName: "kube-api-access-4r8nk") pod "d3603f0e-10ee-4007-a5b0-83dfacefa9d4" (UID: "d3603f0e-10ee-4007-a5b0-83dfacefa9d4"). InnerVolumeSpecName "kube-api-access-4r8nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.365126 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" event={"ID":"d3603f0e-10ee-4007-a5b0-83dfacefa9d4","Type":"ContainerDied","Data":"665b60c0f41bb7db3019ce9ce83fea241f9360073206b9a0c07778b15b239903"} Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.365170 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="665b60c0f41bb7db3019ce9ce83fea241f9360073206b9a0c07778b15b239903" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.365286 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c7l2k" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.366250 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57dc5163-22bf-4b59-a601-2ca96749dead-kube-api-access-v929f" (OuterVolumeSpecName: "kube-api-access-v929f") pod "57dc5163-22bf-4b59-a601-2ca96749dead" (UID: "57dc5163-22bf-4b59-a601-2ca96749dead"). InnerVolumeSpecName "kube-api-access-v929f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.452932 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v929f\" (UniqueName: \"kubernetes.io/projected/57dc5163-22bf-4b59-a601-2ca96749dead-kube-api-access-v929f\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.452971 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dc5163-22bf-4b59-a601-2ca96749dead-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.452981 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.452991 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r8nk\" (UniqueName: \"kubernetes.io/projected/d3603f0e-10ee-4007-a5b0-83dfacefa9d4-kube-api-access-4r8nk\") on node \"crc\" DevicePath \"\"" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.490761 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-blfw6"] Mar 14 05:50:58 crc kubenswrapper[4713]: E0314 05:50:58.491233 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dc5163-22bf-4b59-a601-2ca96749dead" 
containerName="mariadb-account-create-update" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.491250 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dc5163-22bf-4b59-a601-2ca96749dead" containerName="mariadb-account-create-update" Mar 14 05:50:58 crc kubenswrapper[4713]: E0314 05:50:58.491294 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3603f0e-10ee-4007-a5b0-83dfacefa9d4" containerName="mariadb-database-create" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.491301 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3603f0e-10ee-4007-a5b0-83dfacefa9d4" containerName="mariadb-database-create" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.491496 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3603f0e-10ee-4007-a5b0-83dfacefa9d4" containerName="mariadb-database-create" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.491521 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="57dc5163-22bf-4b59-a601-2ca96749dead" containerName="mariadb-account-create-update" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.492335 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-blfw6" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.560258 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5774e38-4c05-4f83-9b36-714a22a03d0d-operator-scripts\") pod \"glance-db-create-blfw6\" (UID: \"b5774e38-4c05-4f83-9b36-714a22a03d0d\") " pod="openstack/glance-db-create-blfw6" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.560869 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmd5j\" (UniqueName: \"kubernetes.io/projected/b5774e38-4c05-4f83-9b36-714a22a03d0d-kube-api-access-tmd5j\") pod \"glance-db-create-blfw6\" (UID: \"b5774e38-4c05-4f83-9b36-714a22a03d0d\") " pod="openstack/glance-db-create-blfw6" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.583166 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-blfw6"] Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.629138 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a8cb-account-create-update-jbn8g"] Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.630613 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.634563 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.641586 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a8cb-account-create-update-jbn8g"] Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.650490 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qc8rx"] Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.666676 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccjpj\" (UniqueName: \"kubernetes.io/projected/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-kube-api-access-ccjpj\") pod \"glance-a8cb-account-create-update-jbn8g\" (UID: \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\") " pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.666778 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmd5j\" (UniqueName: \"kubernetes.io/projected/b5774e38-4c05-4f83-9b36-714a22a03d0d-kube-api-access-tmd5j\") pod \"glance-db-create-blfw6\" (UID: \"b5774e38-4c05-4f83-9b36-714a22a03d0d\") " pod="openstack/glance-db-create-blfw6" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.666845 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-operator-scripts\") pod \"glance-a8cb-account-create-update-jbn8g\" (UID: \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\") " pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.667110 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5774e38-4c05-4f83-9b36-714a22a03d0d-operator-scripts\") pod \"glance-db-create-blfw6\" (UID: \"b5774e38-4c05-4f83-9b36-714a22a03d0d\") " pod="openstack/glance-db-create-blfw6" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.684143 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5774e38-4c05-4f83-9b36-714a22a03d0d-operator-scripts\") pod \"glance-db-create-blfw6\" (UID: \"b5774e38-4c05-4f83-9b36-714a22a03d0d\") " pod="openstack/glance-db-create-blfw6" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.692758 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmd5j\" (UniqueName: \"kubernetes.io/projected/b5774e38-4c05-4f83-9b36-714a22a03d0d-kube-api-access-tmd5j\") pod \"glance-db-create-blfw6\" (UID: \"b5774e38-4c05-4f83-9b36-714a22a03d0d\") " pod="openstack/glance-db-create-blfw6" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.769472 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccjpj\" (UniqueName: \"kubernetes.io/projected/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-kube-api-access-ccjpj\") pod \"glance-a8cb-account-create-update-jbn8g\" (UID: \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\") " pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.769614 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-operator-scripts\") pod \"glance-a8cb-account-create-update-jbn8g\" (UID: \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\") " pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.770574 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-operator-scripts\") pod \"glance-a8cb-account-create-update-jbn8g\" (UID: \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\") " pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.796226 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccjpj\" (UniqueName: \"kubernetes.io/projected/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-kube-api-access-ccjpj\") pod \"glance-a8cb-account-create-update-jbn8g\" (UID: \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\") " pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.842145 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x8vvq"] Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.852850 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.879564 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-blfw6" Mar 14 05:50:58 crc kubenswrapper[4713]: I0314 05:50:58.957865 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.241603 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bw7wg"] Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.243790 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bw7wg" Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.262013 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bw7wg"] Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.295924 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqqv\" (UniqueName: \"kubernetes.io/projected/1024d39c-ff15-450a-a55d-c0d673c3a8de-kube-api-access-xxqqv\") pod \"keystone-db-create-bw7wg\" (UID: \"1024d39c-ff15-450a-a55d-c0d673c3a8de\") " pod="openstack/keystone-db-create-bw7wg" Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.296142 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1024d39c-ff15-450a-a55d-c0d673c3a8de-operator-scripts\") pod \"keystone-db-create-bw7wg\" (UID: \"1024d39c-ff15-450a-a55d-c0d673c3a8de\") " pod="openstack/keystone-db-create-bw7wg" Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.395165 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x8vvq" event={"ID":"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861","Type":"ContainerStarted","Data":"276d9efc8cd381804c5c003e9a3bb759293359caec8cbc91dee86d7bc6e67293"} Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.398721 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bb1d-account-create-update-btnkl"] Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.401402 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bb1d-account-create-update-btnkl"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.410152 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.413629 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-a0d1-account-create-update-nzqcb"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.413640 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qc8rx" event={"ID":"1341c453-d963-4a43-a264-0f94dd02b7dd","Type":"ContainerStarted","Data":"46e9b1560b1aa461703bdf8be388baee4a8165314f2d4397e10e05affc9a138c"}
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.427737 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1024d39c-ff15-450a-a55d-c0d673c3a8de-operator-scripts\") pod \"keystone-db-create-bw7wg\" (UID: \"1024d39c-ff15-450a-a55d-c0d673c3a8de\") " pod="openstack/keystone-db-create-bw7wg"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.427965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqqv\" (UniqueName: \"kubernetes.io/projected/1024d39c-ff15-450a-a55d-c0d673c3a8de-kube-api-access-xxqqv\") pod \"keystone-db-create-bw7wg\" (UID: \"1024d39c-ff15-450a-a55d-c0d673c3a8de\") " pod="openstack/keystone-db-create-bw7wg"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.429612 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1024d39c-ff15-450a-a55d-c0d673c3a8de-operator-scripts\") pod \"keystone-db-create-bw7wg\" (UID: \"1024d39c-ff15-450a-a55d-c0d673c3a8de\") " pod="openstack/keystone-db-create-bw7wg"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.439316 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.505090 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bb1d-account-create-update-btnkl"]
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.510516 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqqv\" (UniqueName: \"kubernetes.io/projected/1024d39c-ff15-450a-a55d-c0d673c3a8de-kube-api-access-xxqqv\") pod \"keystone-db-create-bw7wg\" (UID: \"1024d39c-ff15-450a-a55d-c0d673c3a8de\") " pod="openstack/keystone-db-create-bw7wg"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.529564 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86240a8e-7ae2-4cd5-a608-ed6986152ef9-operator-scripts\") pod \"keystone-bb1d-account-create-update-btnkl\" (UID: \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\") " pod="openstack/keystone-bb1d-account-create-update-btnkl"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.529693 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qns6v\" (UniqueName: \"kubernetes.io/projected/86240a8e-7ae2-4cd5-a608-ed6986152ef9-kube-api-access-qns6v\") pod \"keystone-bb1d-account-create-update-btnkl\" (UID: \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\") " pod="openstack/keystone-bb1d-account-create-update-btnkl"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.572291 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bw7wg"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.622338 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5m68v"]
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.624223 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5m68v"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.631591 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86240a8e-7ae2-4cd5-a608-ed6986152ef9-operator-scripts\") pod \"keystone-bb1d-account-create-update-btnkl\" (UID: \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\") " pod="openstack/keystone-bb1d-account-create-update-btnkl"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.631701 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qns6v\" (UniqueName: \"kubernetes.io/projected/86240a8e-7ae2-4cd5-a608-ed6986152ef9-kube-api-access-qns6v\") pod \"keystone-bb1d-account-create-update-btnkl\" (UID: \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\") " pod="openstack/keystone-bb1d-account-create-update-btnkl"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.646824 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86240a8e-7ae2-4cd5-a608-ed6986152ef9-operator-scripts\") pod \"keystone-bb1d-account-create-update-btnkl\" (UID: \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\") " pod="openstack/keystone-bb1d-account-create-update-btnkl"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.657192 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5m68v"]
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.677915 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qns6v\" (UniqueName: \"kubernetes.io/projected/86240a8e-7ae2-4cd5-a608-ed6986152ef9-kube-api-access-qns6v\") pod \"keystone-bb1d-account-create-update-btnkl\" (UID: \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\") " pod="openstack/keystone-bb1d-account-create-update-btnkl"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.686479 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-blfw6"]
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.712241 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.733889 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.734011 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4kcb\" (UniqueName: \"kubernetes.io/projected/597dff88-51ea-4b05-8548-c11611e05914-kube-api-access-x4kcb\") pod \"placement-db-create-5m68v\" (UID: \"597dff88-51ea-4b05-8548-c11611e05914\") " pod="openstack/placement-db-create-5m68v"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.734035 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/597dff88-51ea-4b05-8548-c11611e05914-operator-scripts\") pod \"placement-db-create-5m68v\" (UID: \"597dff88-51ea-4b05-8548-c11611e05914\") " pod="openstack/placement-db-create-5m68v"
Mar 14 05:50:59 crc kubenswrapper[4713]: E0314 05:50:59.738950 4713 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 14 05:50:59 crc kubenswrapper[4713]: E0314 05:50:59.739016 4713 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 14 05:50:59 crc kubenswrapper[4713]: E0314 05:50:59.739085 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift podName:80f03c3b-d224-4e9d-8e52-e0376b3f215f nodeName:}" failed. No retries permitted until 2026-03-14 05:51:07.73904597 +0000 UTC m=+1450.826955270 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift") pod "swift-storage-0" (UID: "80f03c3b-d224-4e9d-8e52-e0376b3f215f") : configmap "swift-ring-files" not found
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.762308 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bb1d-account-create-update-btnkl"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.770906 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b8a0-account-create-update-rxmjz"]
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.773170 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b8a0-account-create-update-rxmjz"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.792669 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.793790 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b8a0-account-create-update-rxmjz"]
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.828474 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a8cb-account-create-update-jbn8g"]
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.839879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4kcb\" (UniqueName: \"kubernetes.io/projected/597dff88-51ea-4b05-8548-c11611e05914-kube-api-access-x4kcb\") pod \"placement-db-create-5m68v\" (UID: \"597dff88-51ea-4b05-8548-c11611e05914\") " pod="openstack/placement-db-create-5m68v"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.839949 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/597dff88-51ea-4b05-8548-c11611e05914-operator-scripts\") pod \"placement-db-create-5m68v\" (UID: \"597dff88-51ea-4b05-8548-c11611e05914\") " pod="openstack/placement-db-create-5m68v"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.841830 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/597dff88-51ea-4b05-8548-c11611e05914-operator-scripts\") pod \"placement-db-create-5m68v\" (UID: \"597dff88-51ea-4b05-8548-c11611e05914\") " pod="openstack/placement-db-create-5m68v"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.870698 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4kcb\" (UniqueName: \"kubernetes.io/projected/597dff88-51ea-4b05-8548-c11611e05914-kube-api-access-x4kcb\") pod \"placement-db-create-5m68v\" (UID: \"597dff88-51ea-4b05-8548-c11611e05914\") " pod="openstack/placement-db-create-5m68v"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.945496 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7493cc88-066e-42ce-880d-5544ba4b0b39-operator-scripts\") pod \"placement-b8a0-account-create-update-rxmjz\" (UID: \"7493cc88-066e-42ce-880d-5544ba4b0b39\") " pod="openstack/placement-b8a0-account-create-update-rxmjz"
Mar 14 05:50:59 crc kubenswrapper[4713]: I0314 05:50:59.945549 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n67r8\" (UniqueName: \"kubernetes.io/projected/7493cc88-066e-42ce-880d-5544ba4b0b39-kube-api-access-n67r8\") pod \"placement-b8a0-account-create-update-rxmjz\" (UID: \"7493cc88-066e-42ce-880d-5544ba4b0b39\") " pod="openstack/placement-b8a0-account-create-update-rxmjz"
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.008562 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5m68v"
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.048922 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7493cc88-066e-42ce-880d-5544ba4b0b39-operator-scripts\") pod \"placement-b8a0-account-create-update-rxmjz\" (UID: \"7493cc88-066e-42ce-880d-5544ba4b0b39\") " pod="openstack/placement-b8a0-account-create-update-rxmjz"
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.048998 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n67r8\" (UniqueName: \"kubernetes.io/projected/7493cc88-066e-42ce-880d-5544ba4b0b39-kube-api-access-n67r8\") pod \"placement-b8a0-account-create-update-rxmjz\" (UID: \"7493cc88-066e-42ce-880d-5544ba4b0b39\") " pod="openstack/placement-b8a0-account-create-update-rxmjz"
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.062879 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7493cc88-066e-42ce-880d-5544ba4b0b39-operator-scripts\") pod \"placement-b8a0-account-create-update-rxmjz\" (UID: \"7493cc88-066e-42ce-880d-5544ba4b0b39\") " pod="openstack/placement-b8a0-account-create-update-rxmjz"
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.071877 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n67r8\" (UniqueName: \"kubernetes.io/projected/7493cc88-066e-42ce-880d-5544ba4b0b39-kube-api-access-n67r8\") pod \"placement-b8a0-account-create-update-rxmjz\" (UID: \"7493cc88-066e-42ce-880d-5544ba4b0b39\") " pod="openstack/placement-b8a0-account-create-update-rxmjz"
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.112076 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b8a0-account-create-update-rxmjz"
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.283966 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bw7wg"]
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.445390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-blfw6" event={"ID":"b5774e38-4c05-4f83-9b36-714a22a03d0d","Type":"ContainerStarted","Data":"0dc107776430eff583810de89db07d5c153c39c58d73ecec1ed91d7dd0d5927e"}
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.448028 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bw7wg" event={"ID":"1024d39c-ff15-450a-a55d-c0d673c3a8de","Type":"ContainerStarted","Data":"69725a7afdc7f2dad3486a48564d3e40e2e01458d6484c4ad99abd856c89db4b"}
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.451069 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ca274e3b-b1c1-4083-8a05-7b9a536fe088","Type":"ContainerStarted","Data":"b67249f26abb622c02e9645da4b3967792ea879acf348df8856feb3358cb277b"}
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.453154 4713 generic.go:334] "Generic (PLEG): container finished" podID="093bc5c5-8f5d-42e8-84dc-f01fd0cfc861" containerID="04c7abccf019758bc4e9c5a0ffe67663b19f2a49a307a6a4dac9ba5787cbe312" exitCode=0
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.454253 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x8vvq" event={"ID":"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861","Type":"ContainerDied","Data":"04c7abccf019758bc4e9c5a0ffe67663b19f2a49a307a6a4dac9ba5787cbe312"}
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.460426 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a8cb-account-create-update-jbn8g" event={"ID":"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1","Type":"ContainerStarted","Data":"fd518efc59cc90c77b7f69f0fca92e0d5b6262d551a9587e75c0166c3805608a"}
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.466252 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bb1d-account-create-update-btnkl"]
Mar 14 05:51:00 crc kubenswrapper[4713]: W0314 05:51:00.499757 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86240a8e_7ae2_4cd5_a608_ed6986152ef9.slice/crio-905b914df257a3b175cef3e5089cfe88ae56d8833054fe7c8ae569d3383886ae WatchSource:0}: Error finding container 905b914df257a3b175cef3e5089cfe88ae56d8833054fe7c8ae569d3383886ae: Status 404 returned error can't find the container with id 905b914df257a3b175cef3e5089cfe88ae56d8833054fe7c8ae569d3383886ae
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.705571 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5m68v"]
Mar 14 05:51:00 crc kubenswrapper[4713]: W0314 05:51:00.735053 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod597dff88_51ea_4b05_8548_c11611e05914.slice/crio-c200179c0c3fa5f74d3c146751fa22342996bfe82d548cbdbf5c8c23380231da WatchSource:0}: Error finding container c200179c0c3fa5f74d3c146751fa22342996bfe82d548cbdbf5c8c23380231da: Status 404 returned error can't find the container with id c200179c0c3fa5f74d3c146751fa22342996bfe82d548cbdbf5c8c23380231da
Mar 14 05:51:00 crc kubenswrapper[4713]: I0314 05:51:00.943557 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b8a0-account-create-update-rxmjz"]
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.170103 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"]
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.171606 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.192868 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"]
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.287114 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f655395-3e0b-452b-a92a-16ac7edb5707-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-mhq2v\" (UID: \"5f655395-3e0b-452b-a92a-16ac7edb5707\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.288337 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm8lt\" (UniqueName: \"kubernetes.io/projected/5f655395-3e0b-452b-a92a-16ac7edb5707-kube-api-access-mm8lt\") pod \"mysqld-exporter-openstack-cell1-db-create-mhq2v\" (UID: \"5f655395-3e0b-452b-a92a-16ac7edb5707\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.390514 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f655395-3e0b-452b-a92a-16ac7edb5707-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-mhq2v\" (UID: \"5f655395-3e0b-452b-a92a-16ac7edb5707\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.390578 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm8lt\" (UniqueName: \"kubernetes.io/projected/5f655395-3e0b-452b-a92a-16ac7edb5707-kube-api-access-mm8lt\") pod \"mysqld-exporter-openstack-cell1-db-create-mhq2v\" (UID: \"5f655395-3e0b-452b-a92a-16ac7edb5707\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.391295 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f655395-3e0b-452b-a92a-16ac7edb5707-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-mhq2v\" (UID: \"5f655395-3e0b-452b-a92a-16ac7edb5707\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.398389 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.419018 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-03ff-account-create-update-wrwt9"]
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.420592 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm8lt\" (UniqueName: \"kubernetes.io/projected/5f655395-3e0b-452b-a92a-16ac7edb5707-kube-api-access-mm8lt\") pod \"mysqld-exporter-openstack-cell1-db-create-mhq2v\" (UID: \"5f655395-3e0b-452b-a92a-16ac7edb5707\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.423506 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.427361 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.497496 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6mzt\" (UniqueName: \"kubernetes.io/projected/f50144ed-c64d-4618-9ac5-afcf5aea1812-kube-api-access-p6mzt\") pod \"mysqld-exporter-03ff-account-create-update-wrwt9\" (UID: \"f50144ed-c64d-4618-9ac5-afcf5aea1812\") " pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.498231 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50144ed-c64d-4618-9ac5-afcf5aea1812-operator-scripts\") pod \"mysqld-exporter-03ff-account-create-update-wrwt9\" (UID: \"f50144ed-c64d-4618-9ac5-afcf5aea1812\") " pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.510301 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b8a0-account-create-update-rxmjz" event={"ID":"7493cc88-066e-42ce-880d-5544ba4b0b39","Type":"ContainerStarted","Data":"03f656d121c7b385c651f978df47e7b8d3a9bd88e9828ea906180c09be2d8d02"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.510379 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b8a0-account-create-update-rxmjz" event={"ID":"7493cc88-066e-42ce-880d-5544ba4b0b39","Type":"ContainerStarted","Data":"3db1e7df1a00b430c91013bc10e55441bf75524d893eba25646f9f1c96cdcf77"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.537713 4713 generic.go:334] "Generic (PLEG): container finished" podID="b5774e38-4c05-4f83-9b36-714a22a03d0d" containerID="3eaa43710026ffc438f6bc547e3e6ca4a5f16f637dced7e35ee2b54255c537d6" exitCode=0
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.537908 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-blfw6" event={"ID":"b5774e38-4c05-4f83-9b36-714a22a03d0d","Type":"ContainerDied","Data":"3eaa43710026ffc438f6bc547e3e6ca4a5f16f637dced7e35ee2b54255c537d6"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.542926 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-03ff-account-create-update-wrwt9"]
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.553198 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5m68v" event={"ID":"597dff88-51ea-4b05-8548-c11611e05914","Type":"ContainerStarted","Data":"279157b447635d024fadc3bd2a98465d4867b3b3b30d9555f8b42045f4909f73"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.553271 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5m68v" event={"ID":"597dff88-51ea-4b05-8548-c11611e05914","Type":"ContainerStarted","Data":"c200179c0c3fa5f74d3c146751fa22342996bfe82d548cbdbf5c8c23380231da"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.569771 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.575053 4713 generic.go:334] "Generic (PLEG): container finished" podID="1024d39c-ff15-450a-a55d-c0d673c3a8de" containerID="88e7070f953f9280c7f2352f2b58f94f91f7a04459a7c836d939181da0645465" exitCode=0
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.602830 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6mzt\" (UniqueName: \"kubernetes.io/projected/f50144ed-c64d-4618-9ac5-afcf5aea1812-kube-api-access-p6mzt\") pod \"mysqld-exporter-03ff-account-create-update-wrwt9\" (UID: \"f50144ed-c64d-4618-9ac5-afcf5aea1812\") " pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.603008 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50144ed-c64d-4618-9ac5-afcf5aea1812-operator-scripts\") pod \"mysqld-exporter-03ff-account-create-update-wrwt9\" (UID: \"f50144ed-c64d-4618-9ac5-afcf5aea1812\") " pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.603838 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50144ed-c64d-4618-9ac5-afcf5aea1812-operator-scripts\") pod \"mysqld-exporter-03ff-account-create-update-wrwt9\" (UID: \"f50144ed-c64d-4618-9ac5-afcf5aea1812\") " pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.607147 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.607297 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bw7wg" event={"ID":"1024d39c-ff15-450a-a55d-c0d673c3a8de","Type":"ContainerDied","Data":"88e7070f953f9280c7f2352f2b58f94f91f7a04459a7c836d939181da0645465"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.607812 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ca274e3b-b1c1-4083-8a05-7b9a536fe088","Type":"ContainerStarted","Data":"9447d019675d1747b64424e68705f56e957aba967ac0e6d7aefdefd7f03e6152"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.628046 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bb1d-account-create-update-btnkl" event={"ID":"86240a8e-7ae2-4cd5-a608-ed6986152ef9","Type":"ContainerStarted","Data":"124fa9d6943941875505a667614e4fbc8701d12fa3947c9be1d76d76e9aee269"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.628092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bb1d-account-create-update-btnkl" event={"ID":"86240a8e-7ae2-4cd5-a608-ed6986152ef9","Type":"ContainerStarted","Data":"905b914df257a3b175cef3e5089cfe88ae56d8833054fe7c8ae569d3383886ae"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.629110 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jg97k"]
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.629348 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k" podUID="4feb0fd5-d952-4323-821e-187c23e16463" containerName="dnsmasq-dns" containerID="cri-o://07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616" gracePeriod=10
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.637679 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6mzt\" (UniqueName: \"kubernetes.io/projected/f50144ed-c64d-4618-9ac5-afcf5aea1812-kube-api-access-p6mzt\") pod \"mysqld-exporter-03ff-account-create-update-wrwt9\" (UID: \"f50144ed-c64d-4618-9ac5-afcf5aea1812\") " pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.652265 4713 generic.go:334] "Generic (PLEG): container finished" podID="0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1" containerID="99284074a67e217fd7bba1e817575f53c525035b8bf7dad3ca0327acde0dac45" exitCode=0
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.652428 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a8cb-account-create-update-jbn8g" event={"ID":"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1","Type":"ContainerDied","Data":"99284074a67e217fd7bba1e817575f53c525035b8bf7dad3ca0327acde0dac45"}
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.665957 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b8a0-account-create-update-rxmjz" podStartSLOduration=2.6659336270000003 podStartE2EDuration="2.665933627s" podCreationTimestamp="2026-03-14 05:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:51:01.542690041 +0000 UTC m=+1444.630599341" watchObservedRunningTime="2026-03-14 05:51:01.665933627 +0000 UTC m=+1444.753842937"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.739377 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-5m68v" podStartSLOduration=2.7393520860000002 podStartE2EDuration="2.739352086s" podCreationTimestamp="2026-03-14 05:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:51:01.611828394 +0000 UTC m=+1444.699737704" watchObservedRunningTime="2026-03-14 05:51:01.739352086 +0000 UTC m=+1444.827261396"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.776495 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.391499331 podStartE2EDuration="11.776472649s" podCreationTimestamp="2026-03-14 05:50:50 +0000 UTC" firstStartedPulling="2026-03-14 05:50:51.681160206 +0000 UTC m=+1434.769069506" lastFinishedPulling="2026-03-14 05:50:58.066133524 +0000 UTC m=+1441.154042824" observedRunningTime="2026-03-14 05:51:01.678777087 +0000 UTC m=+1444.766686397" watchObservedRunningTime="2026-03-14 05:51:01.776472649 +0000 UTC m=+1444.864381949"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.790094 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bb1d-account-create-update-btnkl" podStartSLOduration=2.790073892 podStartE2EDuration="2.790073892s" podCreationTimestamp="2026-03-14 05:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:51:01.707630035 +0000 UTC m=+1444.795539335" watchObservedRunningTime="2026-03-14 05:51:01.790073892 +0000 UTC m=+1444.877983192"
Mar 14 05:51:01 crc kubenswrapper[4713]: I0314 05:51:01.857741 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9"
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.385341 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x8vvq"
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.422547 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-operator-scripts\") pod \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\" (UID: \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\") "
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.422650 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ghw4\" (UniqueName: \"kubernetes.io/projected/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-kube-api-access-8ghw4\") pod \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\" (UID: \"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861\") "
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.425575 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "093bc5c5-8f5d-42e8-84dc-f01fd0cfc861" (UID: "093bc5c5-8f5d-42e8-84dc-f01fd0cfc861"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.438603 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-kube-api-access-8ghw4" (OuterVolumeSpecName: "kube-api-access-8ghw4") pod "093bc5c5-8f5d-42e8-84dc-f01fd0cfc861" (UID: "093bc5c5-8f5d-42e8-84dc-f01fd0cfc861"). InnerVolumeSpecName "kube-api-access-8ghw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.526077 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.526110 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ghw4\" (UniqueName: \"kubernetes.io/projected/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861-kube-api-access-8ghw4\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.600866 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.630536 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnj9z\" (UniqueName: \"kubernetes.io/projected/4feb0fd5-d952-4323-821e-187c23e16463-kube-api-access-bnj9z\") pod \"4feb0fd5-d952-4323-821e-187c23e16463\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") "
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.630754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-dns-svc\") pod \"4feb0fd5-d952-4323-821e-187c23e16463\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") "
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.630827 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-config\") pod \"4feb0fd5-d952-4323-821e-187c23e16463\" (UID: \"4feb0fd5-d952-4323-821e-187c23e16463\") "
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.646587 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4feb0fd5-d952-4323-821e-187c23e16463-kube-api-access-bnj9z" (OuterVolumeSpecName: "kube-api-access-bnj9z") pod "4feb0fd5-d952-4323-821e-187c23e16463" (UID: "4feb0fd5-d952-4323-821e-187c23e16463"). InnerVolumeSpecName "kube-api-access-bnj9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.697639 4713 generic.go:334] "Generic (PLEG): container finished" podID="4feb0fd5-d952-4323-821e-187c23e16463" containerID="07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616" exitCode=0
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.698038 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k" event={"ID":"4feb0fd5-d952-4323-821e-187c23e16463","Type":"ContainerDied","Data":"07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616"}
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.698178 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k" event={"ID":"4feb0fd5-d952-4323-821e-187c23e16463","Type":"ContainerDied","Data":"352262f82231a0a19c0ac6d9c1659d4556edf977d3a059b82bbb7c03d25b83d8"}
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.698362 4713 scope.go:117] "RemoveContainer" containerID="07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616"
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.698711 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jg97k"
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.702355 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4feb0fd5-d952-4323-821e-187c23e16463" (UID: "4feb0fd5-d952-4323-821e-187c23e16463"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.726964 4713 generic.go:334] "Generic (PLEG): container finished" podID="7493cc88-066e-42ce-880d-5544ba4b0b39" containerID="03f656d121c7b385c651f978df47e7b8d3a9bd88e9828ea906180c09be2d8d02" exitCode=0
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.727466 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b8a0-account-create-update-rxmjz" event={"ID":"7493cc88-066e-42ce-880d-5544ba4b0b39","Type":"ContainerDied","Data":"03f656d121c7b385c651f978df47e7b8d3a9bd88e9828ea906180c09be2d8d02"}
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.739149 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnj9z\" (UniqueName: \"kubernetes.io/projected/4feb0fd5-d952-4323-821e-187c23e16463-kube-api-access-bnj9z\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.739190 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.745291 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-config" (OuterVolumeSpecName: "config") pod "4feb0fd5-d952-4323-821e-187c23e16463" (UID: "4feb0fd5-d952-4323-821e-187c23e16463"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.760654 4713 generic.go:334] "Generic (PLEG): container finished" podID="597dff88-51ea-4b05-8548-c11611e05914" containerID="279157b447635d024fadc3bd2a98465d4867b3b3b30d9555f8b42045f4909f73" exitCode=0
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.761520 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5m68v" event={"ID":"597dff88-51ea-4b05-8548-c11611e05914","Type":"ContainerDied","Data":"279157b447635d024fadc3bd2a98465d4867b3b3b30d9555f8b42045f4909f73"}
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.774645 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x8vvq"
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.774631 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x8vvq" event={"ID":"093bc5c5-8f5d-42e8-84dc-f01fd0cfc861","Type":"ContainerDied","Data":"276d9efc8cd381804c5c003e9a3bb759293359caec8cbc91dee86d7bc6e67293"}
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.774854 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="276d9efc8cd381804c5c003e9a3bb759293359caec8cbc91dee86d7bc6e67293"
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.798110 4713 generic.go:334] "Generic (PLEG): container finished" podID="86240a8e-7ae2-4cd5-a608-ed6986152ef9" containerID="124fa9d6943941875505a667614e4fbc8701d12fa3947c9be1d76d76e9aee269" exitCode=0
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.799130 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bb1d-account-create-update-btnkl" event={"ID":"86240a8e-7ae2-4cd5-a608-ed6986152ef9","Type":"ContainerDied","Data":"124fa9d6943941875505a667614e4fbc8701d12fa3947c9be1d76d76e9aee269"}
Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.843322 4713
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4feb0fd5-d952-4323-821e-187c23e16463-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:02 crc kubenswrapper[4713]: I0314 05:51:02.906802 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"] Mar 14 05:51:03 crc kubenswrapper[4713]: I0314 05:51:03.029027 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-03ff-account-create-update-wrwt9"] Mar 14 05:51:03 crc kubenswrapper[4713]: I0314 05:51:03.082302 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jg97k"] Mar 14 05:51:03 crc kubenswrapper[4713]: I0314 05:51:03.117551 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jg97k"] Mar 14 05:51:03 crc kubenswrapper[4713]: I0314 05:51:03.610169 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4feb0fd5-d952-4323-821e-187c23e16463" path="/var/lib/kubelet/pods/4feb0fd5-d952-4323-821e-187c23e16463/volumes" Mar 14 05:51:04 crc kubenswrapper[4713]: W0314 05:51:04.615880 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf50144ed_c64d_4618_9ac5_afcf5aea1812.slice/crio-d779eafe18a9eb55f5d61b785535c2750bdb09366745108c2e30f9e1d5c109c4 WatchSource:0}: Error finding container d779eafe18a9eb55f5d61b785535c2750bdb09366745108c2e30f9e1d5c109c4: Status 404 returned error can't find the container with id d779eafe18a9eb55f5d61b785535c2750bdb09366745108c2e30f9e1d5c109c4 Mar 14 05:51:04 crc kubenswrapper[4713]: W0314 05:51:04.627849 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f655395_3e0b_452b_a92a_16ac7edb5707.slice/crio-da260d57a40209ce9d8e875f79ea8eb604038bd427b7b95df339585c2beb5395 WatchSource:0}: 
Error finding container da260d57a40209ce9d8e875f79ea8eb604038bd427b7b95df339585c2beb5395: Status 404 returned error can't find the container with id da260d57a40209ce9d8e875f79ea8eb604038bd427b7b95df339585c2beb5395 Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.745071 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.745880 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bw7wg" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.756301 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bb1d-account-create-update-btnkl" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.786092 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b8a0-account-create-update-rxmjz" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.788600 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-blfw6" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.795912 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5m68v" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838065 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxqqv\" (UniqueName: \"kubernetes.io/projected/1024d39c-ff15-450a-a55d-c0d673c3a8de-kube-api-access-xxqqv\") pod \"1024d39c-ff15-450a-a55d-c0d673c3a8de\" (UID: \"1024d39c-ff15-450a-a55d-c0d673c3a8de\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838132 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5774e38-4c05-4f83-9b36-714a22a03d0d-operator-scripts\") pod \"b5774e38-4c05-4f83-9b36-714a22a03d0d\" (UID: \"b5774e38-4c05-4f83-9b36-714a22a03d0d\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838142 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bw7wg" event={"ID":"1024d39c-ff15-450a-a55d-c0d673c3a8de","Type":"ContainerDied","Data":"69725a7afdc7f2dad3486a48564d3e40e2e01458d6484c4ad99abd856c89db4b"} Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838188 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69725a7afdc7f2dad3486a48564d3e40e2e01458d6484c4ad99abd856c89db4b" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838162 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7493cc88-066e-42ce-880d-5544ba4b0b39-operator-scripts\") pod \"7493cc88-066e-42ce-880d-5544ba4b0b39\" (UID: \"7493cc88-066e-42ce-880d-5544ba4b0b39\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838343 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1024d39c-ff15-450a-a55d-c0d673c3a8de-operator-scripts\") pod \"1024d39c-ff15-450a-a55d-c0d673c3a8de\" (UID: 
\"1024d39c-ff15-450a-a55d-c0d673c3a8de\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838416 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/597dff88-51ea-4b05-8548-c11611e05914-operator-scripts\") pod \"597dff88-51ea-4b05-8548-c11611e05914\" (UID: \"597dff88-51ea-4b05-8548-c11611e05914\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qns6v\" (UniqueName: \"kubernetes.io/projected/86240a8e-7ae2-4cd5-a608-ed6986152ef9-kube-api-access-qns6v\") pod \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\" (UID: \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838505 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccjpj\" (UniqueName: \"kubernetes.io/projected/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-kube-api-access-ccjpj\") pod \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\" (UID: \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838531 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4kcb\" (UniqueName: \"kubernetes.io/projected/597dff88-51ea-4b05-8548-c11611e05914-kube-api-access-x4kcb\") pod \"597dff88-51ea-4b05-8548-c11611e05914\" (UID: \"597dff88-51ea-4b05-8548-c11611e05914\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838574 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n67r8\" (UniqueName: \"kubernetes.io/projected/7493cc88-066e-42ce-880d-5544ba4b0b39-kube-api-access-n67r8\") pod \"7493cc88-066e-42ce-880d-5544ba4b0b39\" (UID: \"7493cc88-066e-42ce-880d-5544ba4b0b39\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838607 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86240a8e-7ae2-4cd5-a608-ed6986152ef9-operator-scripts\") pod \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\" (UID: \"86240a8e-7ae2-4cd5-a608-ed6986152ef9\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838648 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-operator-scripts\") pod \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\" (UID: \"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.838686 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmd5j\" (UniqueName: \"kubernetes.io/projected/b5774e38-4c05-4f83-9b36-714a22a03d0d-kube-api-access-tmd5j\") pod \"b5774e38-4c05-4f83-9b36-714a22a03d0d\" (UID: \"b5774e38-4c05-4f83-9b36-714a22a03d0d\") " Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.847012 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7493cc88-066e-42ce-880d-5544ba4b0b39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7493cc88-066e-42ce-880d-5544ba4b0b39" (UID: "7493cc88-066e-42ce-880d-5544ba4b0b39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.848169 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bw7wg" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.849542 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1024d39c-ff15-450a-a55d-c0d673c3a8de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1024d39c-ff15-450a-a55d-c0d673c3a8de" (UID: "1024d39c-ff15-450a-a55d-c0d673c3a8de"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.853904 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86240a8e-7ae2-4cd5-a608-ed6986152ef9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86240a8e-7ae2-4cd5-a608-ed6986152ef9" (UID: "86240a8e-7ae2-4cd5-a608-ed6986152ef9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.854130 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86240a8e-7ae2-4cd5-a608-ed6986152ef9-kube-api-access-qns6v" (OuterVolumeSpecName: "kube-api-access-qns6v") pod "86240a8e-7ae2-4cd5-a608-ed6986152ef9" (UID: "86240a8e-7ae2-4cd5-a608-ed6986152ef9"). InnerVolumeSpecName "kube-api-access-qns6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.856508 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7493cc88-066e-42ce-880d-5544ba4b0b39-kube-api-access-n67r8" (OuterVolumeSpecName: "kube-api-access-n67r8") pod "7493cc88-066e-42ce-880d-5544ba4b0b39" (UID: "7493cc88-066e-42ce-880d-5544ba4b0b39"). InnerVolumeSpecName "kube-api-access-n67r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.858970 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5774e38-4c05-4f83-9b36-714a22a03d0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5774e38-4c05-4f83-9b36-714a22a03d0d" (UID: "b5774e38-4c05-4f83-9b36-714a22a03d0d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.859509 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1024d39c-ff15-450a-a55d-c0d673c3a8de-kube-api-access-xxqqv" (OuterVolumeSpecName: "kube-api-access-xxqqv") pod "1024d39c-ff15-450a-a55d-c0d673c3a8de" (UID: "1024d39c-ff15-450a-a55d-c0d673c3a8de"). InnerVolumeSpecName "kube-api-access-xxqqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.859563 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597dff88-51ea-4b05-8548-c11611e05914-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "597dff88-51ea-4b05-8548-c11611e05914" (UID: "597dff88-51ea-4b05-8548-c11611e05914"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.860001 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1" (UID: "0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.864784 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bb1d-account-create-update-btnkl" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.864756 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bb1d-account-create-update-btnkl" event={"ID":"86240a8e-7ae2-4cd5-a608-ed6986152ef9","Type":"ContainerDied","Data":"905b914df257a3b175cef3e5089cfe88ae56d8833054fe7c8ae569d3383886ae"} Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.864920 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="905b914df257a3b175cef3e5089cfe88ae56d8833054fe7c8ae569d3383886ae" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.865336 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/597dff88-51ea-4b05-8548-c11611e05914-kube-api-access-x4kcb" (OuterVolumeSpecName: "kube-api-access-x4kcb") pod "597dff88-51ea-4b05-8548-c11611e05914" (UID: "597dff88-51ea-4b05-8548-c11611e05914"). InnerVolumeSpecName "kube-api-access-x4kcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.868758 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9" event={"ID":"f50144ed-c64d-4618-9ac5-afcf5aea1812","Type":"ContainerStarted","Data":"d779eafe18a9eb55f5d61b785535c2750bdb09366745108c2e30f9e1d5c109c4"} Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.870716 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a8cb-account-create-update-jbn8g" event={"ID":"0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1","Type":"ContainerDied","Data":"fd518efc59cc90c77b7f69f0fca92e0d5b6262d551a9587e75c0166c3805608a"} Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.870744 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd518efc59cc90c77b7f69f0fca92e0d5b6262d551a9587e75c0166c3805608a" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.870819 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a8cb-account-create-update-jbn8g" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.875011 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b8a0-account-create-update-rxmjz" event={"ID":"7493cc88-066e-42ce-880d-5544ba4b0b39","Type":"ContainerDied","Data":"3db1e7df1a00b430c91013bc10e55441bf75524d893eba25646f9f1c96cdcf77"} Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.875053 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3db1e7df1a00b430c91013bc10e55441bf75524d893eba25646f9f1c96cdcf77" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.875197 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b8a0-account-create-update-rxmjz" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.875814 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5774e38-4c05-4f83-9b36-714a22a03d0d-kube-api-access-tmd5j" (OuterVolumeSpecName: "kube-api-access-tmd5j") pod "b5774e38-4c05-4f83-9b36-714a22a03d0d" (UID: "b5774e38-4c05-4f83-9b36-714a22a03d0d"). InnerVolumeSpecName "kube-api-access-tmd5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.876480 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-kube-api-access-ccjpj" (OuterVolumeSpecName: "kube-api-access-ccjpj") pod "0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1" (UID: "0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1"). InnerVolumeSpecName "kube-api-access-ccjpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.893133 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v" event={"ID":"5f655395-3e0b-452b-a92a-16ac7edb5707","Type":"ContainerStarted","Data":"da260d57a40209ce9d8e875f79ea8eb604038bd427b7b95df339585c2beb5395"} Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.897137 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7493cc88-066e-42ce-880d-5544ba4b0b39-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.897399 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1024d39c-ff15-450a-a55d-c0d673c3a8de-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.897436 4713 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-qns6v\" (UniqueName: \"kubernetes.io/projected/86240a8e-7ae2-4cd5-a608-ed6986152ef9-kube-api-access-qns6v\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.897454 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n67r8\" (UniqueName: \"kubernetes.io/projected/7493cc88-066e-42ce-880d-5544ba4b0b39-kube-api-access-n67r8\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.897475 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86240a8e-7ae2-4cd5-a608-ed6986152ef9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.915370 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-blfw6" event={"ID":"b5774e38-4c05-4f83-9b36-714a22a03d0d","Type":"ContainerDied","Data":"0dc107776430eff583810de89db07d5c153c39c58d73ecec1ed91d7dd0d5927e"} Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.915428 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dc107776430eff583810de89db07d5c153c39c58d73ecec1ed91d7dd0d5927e" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.915470 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-blfw6" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.927608 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5m68v" event={"ID":"597dff88-51ea-4b05-8548-c11611e05914","Type":"ContainerDied","Data":"c200179c0c3fa5f74d3c146751fa22342996bfe82d548cbdbf5c8c23380231da"} Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.927649 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c200179c0c3fa5f74d3c146751fa22342996bfe82d548cbdbf5c8c23380231da" Mar 14 05:51:04 crc kubenswrapper[4713]: I0314 05:51:04.927735 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5m68v" Mar 14 05:51:05 crc kubenswrapper[4713]: I0314 05:51:05.000878 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/597dff88-51ea-4b05-8548-c11611e05914-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:05 crc kubenswrapper[4713]: I0314 05:51:05.000936 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccjpj\" (UniqueName: \"kubernetes.io/projected/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-kube-api-access-ccjpj\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:05 crc kubenswrapper[4713]: I0314 05:51:05.000952 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4kcb\" (UniqueName: \"kubernetes.io/projected/597dff88-51ea-4b05-8548-c11611e05914-kube-api-access-x4kcb\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:05 crc kubenswrapper[4713]: I0314 05:51:05.000966 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:05 crc kubenswrapper[4713]: I0314 05:51:05.000980 4713 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-tmd5j\" (UniqueName: \"kubernetes.io/projected/b5774e38-4c05-4f83-9b36-714a22a03d0d-kube-api-access-tmd5j\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:05 crc kubenswrapper[4713]: I0314 05:51:05.000993 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxqqv\" (UniqueName: \"kubernetes.io/projected/1024d39c-ff15-450a-a55d-c0d673c3a8de-kube-api-access-xxqqv\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:05 crc kubenswrapper[4713]: I0314 05:51:05.001005 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5774e38-4c05-4f83-9b36-714a22a03d0d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.433142 4713 scope.go:117] "RemoveContainer" containerID="74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879" Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.521940 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-x8vvq"] Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.533268 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-x8vvq"] Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541120 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-25kgx"] Mar 14 05:51:06 crc kubenswrapper[4713]: E0314 05:51:06.541622 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5774e38-4c05-4f83-9b36-714a22a03d0d" containerName="mariadb-database-create" Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541634 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5774e38-4c05-4f83-9b36-714a22a03d0d" containerName="mariadb-database-create" Mar 14 05:51:06 crc kubenswrapper[4713]: E0314 05:51:06.541644 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4feb0fd5-d952-4323-821e-187c23e16463" 
containerName="init" Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541650 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4feb0fd5-d952-4323-821e-187c23e16463" containerName="init" Mar 14 05:51:06 crc kubenswrapper[4713]: E0314 05:51:06.541659 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86240a8e-7ae2-4cd5-a608-ed6986152ef9" containerName="mariadb-account-create-update" Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541665 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="86240a8e-7ae2-4cd5-a608-ed6986152ef9" containerName="mariadb-account-create-update" Mar 14 05:51:06 crc kubenswrapper[4713]: E0314 05:51:06.541684 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093bc5c5-8f5d-42e8-84dc-f01fd0cfc861" containerName="mariadb-account-create-update" Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541689 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="093bc5c5-8f5d-42e8-84dc-f01fd0cfc861" containerName="mariadb-account-create-update" Mar 14 05:51:06 crc kubenswrapper[4713]: E0314 05:51:06.541702 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4feb0fd5-d952-4323-821e-187c23e16463" containerName="dnsmasq-dns" Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541708 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4feb0fd5-d952-4323-821e-187c23e16463" containerName="dnsmasq-dns" Mar 14 05:51:06 crc kubenswrapper[4713]: E0314 05:51:06.541721 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7493cc88-066e-42ce-880d-5544ba4b0b39" containerName="mariadb-account-create-update" Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541727 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7493cc88-066e-42ce-880d-5544ba4b0b39" containerName="mariadb-account-create-update" Mar 14 05:51:06 crc kubenswrapper[4713]: E0314 05:51:06.541747 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1024d39c-ff15-450a-a55d-c0d673c3a8de" containerName="mariadb-database-create"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541753 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1024d39c-ff15-450a-a55d-c0d673c3a8de" containerName="mariadb-database-create"
Mar 14 05:51:06 crc kubenswrapper[4713]: E0314 05:51:06.541837 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="597dff88-51ea-4b05-8548-c11611e05914" containerName="mariadb-database-create"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541844 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="597dff88-51ea-4b05-8548-c11611e05914" containerName="mariadb-database-create"
Mar 14 05:51:06 crc kubenswrapper[4713]: E0314 05:51:06.541856 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1" containerName="mariadb-account-create-update"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.541862 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1" containerName="mariadb-account-create-update"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.542075 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7493cc88-066e-42ce-880d-5544ba4b0b39" containerName="mariadb-account-create-update"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.542088 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4feb0fd5-d952-4323-821e-187c23e16463" containerName="dnsmasq-dns"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.542098 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1" containerName="mariadb-account-create-update"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.542109 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="597dff88-51ea-4b05-8548-c11611e05914" containerName="mariadb-database-create"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.542126 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1024d39c-ff15-450a-a55d-c0d673c3a8de" containerName="mariadb-database-create"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.542186 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="86240a8e-7ae2-4cd5-a608-ed6986152ef9" containerName="mariadb-account-create-update"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.542195 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="093bc5c5-8f5d-42e8-84dc-f01fd0cfc861" containerName="mariadb-account-create-update"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.542202 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5774e38-4c05-4f83-9b36-714a22a03d0d" containerName="mariadb-database-create"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.542991 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-25kgx"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.545282 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.550992 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-25kgx"]
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.640008 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4cmh\" (UniqueName: \"kubernetes.io/projected/c2264856-3255-43b9-9fd5-50dd1f7d5797-kube-api-access-s4cmh\") pod \"root-account-create-update-25kgx\" (UID: \"c2264856-3255-43b9-9fd5-50dd1f7d5797\") " pod="openstack/root-account-create-update-25kgx"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.640641 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2264856-3255-43b9-9fd5-50dd1f7d5797-operator-scripts\") pod \"root-account-create-update-25kgx\" (UID: \"c2264856-3255-43b9-9fd5-50dd1f7d5797\") " pod="openstack/root-account-create-update-25kgx"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.743263 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4cmh\" (UniqueName: \"kubernetes.io/projected/c2264856-3255-43b9-9fd5-50dd1f7d5797-kube-api-access-s4cmh\") pod \"root-account-create-update-25kgx\" (UID: \"c2264856-3255-43b9-9fd5-50dd1f7d5797\") " pod="openstack/root-account-create-update-25kgx"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.743327 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2264856-3255-43b9-9fd5-50dd1f7d5797-operator-scripts\") pod \"root-account-create-update-25kgx\" (UID: \"c2264856-3255-43b9-9fd5-50dd1f7d5797\") " pod="openstack/root-account-create-update-25kgx"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.743947 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2264856-3255-43b9-9fd5-50dd1f7d5797-operator-scripts\") pod \"root-account-create-update-25kgx\" (UID: \"c2264856-3255-43b9-9fd5-50dd1f7d5797\") " pod="openstack/root-account-create-update-25kgx"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.769525 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4cmh\" (UniqueName: \"kubernetes.io/projected/c2264856-3255-43b9-9fd5-50dd1f7d5797-kube-api-access-s4cmh\") pod \"root-account-create-update-25kgx\" (UID: \"c2264856-3255-43b9-9fd5-50dd1f7d5797\") " pod="openstack/root-account-create-update-25kgx"
Mar 14 05:51:06 crc kubenswrapper[4713]: I0314 05:51:06.879030 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-25kgx"
Mar 14 05:51:07 crc kubenswrapper[4713]: I0314 05:51:07.578904 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093bc5c5-8f5d-42e8-84dc-f01fd0cfc861" path="/var/lib/kubelet/pods/093bc5c5-8f5d-42e8-84dc-f01fd0cfc861/volumes"
Mar 14 05:51:07 crc kubenswrapper[4713]: I0314 05:51:07.782020 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0"
Mar 14 05:51:07 crc kubenswrapper[4713]: E0314 05:51:07.783159 4713 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 14 05:51:07 crc kubenswrapper[4713]: E0314 05:51:07.783183 4713 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 14 05:51:07 crc kubenswrapper[4713]: E0314 05:51:07.783248 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift podName:80f03c3b-d224-4e9d-8e52-e0376b3f215f nodeName:}" failed. No retries permitted until 2026-03-14 05:51:23.783233139 +0000 UTC m=+1466.871142439 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift") pod "swift-storage-0" (UID: "80f03c3b-d224-4e9d-8e52-e0376b3f215f") : configmap "swift-ring-files" not found
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.005782 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-999d8b566-8h8bm" podUID="a79b34eb-9b98-45d3-b470-c3925639b028" containerName="console" containerID="cri-o://bb3065bb5c298c9255c05eea1c69f4c10236bf269c53c3c06af989819c461aa8" gracePeriod=15
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.352106 4713 patch_prober.go:28] interesting pod/console-999d8b566-8h8bm container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.94:8443/health\": dial tcp 10.217.0.94:8443: connect: connection refused" start-of-body=
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.352169 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-999d8b566-8h8bm" podUID="a79b34eb-9b98-45d3-b470-c3925639b028" containerName="console" probeResult="failure" output="Get \"https://10.217.0.94:8443/health\": dial tcp 10.217.0.94:8443: connect: connection refused"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.688653 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-k57vf"]
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.690307 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.696216 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.696244 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kknmp"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.706685 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-k57vf"]
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.804273 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-db-sync-config-data\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.804348 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-config-data\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.804384 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mcsx\" (UniqueName: \"kubernetes.io/projected/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-kube-api-access-8mcsx\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.804465 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-combined-ca-bundle\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.875833 4713 scope.go:117] "RemoveContainer" containerID="07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616"
Mar 14 05:51:08 crc kubenswrapper[4713]: E0314 05:51:08.876681 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616\": container with ID starting with 07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616 not found: ID does not exist" containerID="07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.876747 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616"} err="failed to get container status \"07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616\": rpc error: code = NotFound desc = could not find container \"07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616\": container with ID starting with 07bf9b545f3f3c6109c4272ad90628f4266c02d72462fb3e6b4b9d49b562f616 not found: ID does not exist"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.876779 4713 scope.go:117] "RemoveContainer" containerID="74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879"
Mar 14 05:51:08 crc kubenswrapper[4713]: E0314 05:51:08.877099 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879\": container with ID starting with 74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879 not found: ID does not exist" containerID="74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.877133 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879"} err="failed to get container status \"74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879\": rpc error: code = NotFound desc = could not find container \"74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879\": container with ID starting with 74229848d39b8e4c7bb0e48d7310ec366bf472631d6936059beae835a2f58879 not found: ID does not exist"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.906625 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-config-data\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.906681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mcsx\" (UniqueName: \"kubernetes.io/projected/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-kube-api-access-8mcsx\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.906756 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-combined-ca-bundle\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.906860 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-db-sync-config-data\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.913116 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-combined-ca-bundle\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.913406 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-config-data\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.914396 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-db-sync-config-data\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.928874 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mcsx\" (UniqueName: \"kubernetes.io/projected/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-kube-api-access-8mcsx\") pod \"glance-db-sync-k57vf\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.981384 4713 generic.go:334] "Generic (PLEG): container finished" podID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" containerID="6d464f5b92a469a58eed5cef7c3b96f48a6b9cba513745ea0786417981a94f06" exitCode=0
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.981465 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35c58160-d324-41f9-8c2d-410ba3fb1bb5","Type":"ContainerDied","Data":"6d464f5b92a469a58eed5cef7c3b96f48a6b9cba513745ea0786417981a94f06"}
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.987269 4713 generic.go:334] "Generic (PLEG): container finished" podID="145c4018-82f1-49b5-9d3b-15c97c299a4a" containerID="aa4a5ceb1b002ee0c5369d9f8a6166c1e3452680788305e7434abe62eb526493" exitCode=0
Mar 14 05:51:08 crc kubenswrapper[4713]: I0314 05:51:08.987358 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"145c4018-82f1-49b5-9d3b-15c97c299a4a","Type":"ContainerDied","Data":"aa4a5ceb1b002ee0c5369d9f8a6166c1e3452680788305e7434abe62eb526493"}
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.013675 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k57vf"
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.017166 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-999d8b566-8h8bm_a79b34eb-9b98-45d3-b470-c3925639b028/console/0.log"
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.017679 4713 generic.go:334] "Generic (PLEG): container finished" podID="a79b34eb-9b98-45d3-b470-c3925639b028" containerID="bb3065bb5c298c9255c05eea1c69f4c10236bf269c53c3c06af989819c461aa8" exitCode=2
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.017780 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-999d8b566-8h8bm" event={"ID":"a79b34eb-9b98-45d3-b470-c3925639b028","Type":"ContainerDied","Data":"bb3065bb5c298c9255c05eea1c69f4c10236bf269c53c3c06af989819c461aa8"}
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.040647 4713 generic.go:334] "Generic (PLEG): container finished" podID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerID="d4c0f09392624ed19b999fbb5f7689557216f97a3ec3122b4ec0efdb2a5ceb6e" exitCode=0
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.040736 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b8fef26f-0e1b-4e81-8969-a4b972708cb3","Type":"ContainerDied","Data":"d4c0f09392624ed19b999fbb5f7689557216f97a3ec3122b4ec0efdb2a5ceb6e"}
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.072593 4713 generic.go:334] "Generic (PLEG): container finished" podID="ca9b4055-d903-491e-bbf8-4777d51a1af8" containerID="757454825f7b8736377c878e350329ecf32a743829e7db7e77afe631eff684e5" exitCode=0
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.072632 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca9b4055-d903-491e-bbf8-4777d51a1af8","Type":"ContainerDied","Data":"757454825f7b8736377c878e350329ecf32a743829e7db7e77afe631eff684e5"}
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.446158 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-999d8b566-8h8bm_a79b34eb-9b98-45d3-b470-c3925639b028/console/0.log"
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.446597 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-999d8b566-8h8bm"
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.630819 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-25kgx"]
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.640061 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-serving-cert\") pod \"a79b34eb-9b98-45d3-b470-c3925639b028\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") "
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.640181 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-oauth-config\") pod \"a79b34eb-9b98-45d3-b470-c3925639b028\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") "
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.640235 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xn64\" (UniqueName: \"kubernetes.io/projected/a79b34eb-9b98-45d3-b470-c3925639b028-kube-api-access-5xn64\") pod \"a79b34eb-9b98-45d3-b470-c3925639b028\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") "
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.640272 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-console-config\") pod \"a79b34eb-9b98-45d3-b470-c3925639b028\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") "
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.640392 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-service-ca\") pod \"a79b34eb-9b98-45d3-b470-c3925639b028\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") "
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.640456 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-trusted-ca-bundle\") pod \"a79b34eb-9b98-45d3-b470-c3925639b028\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") "
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.640511 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-oauth-serving-cert\") pod \"a79b34eb-9b98-45d3-b470-c3925639b028\" (UID: \"a79b34eb-9b98-45d3-b470-c3925639b028\") "
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.641824 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a79b34eb-9b98-45d3-b470-c3925639b028" (UID: "a79b34eb-9b98-45d3-b470-c3925639b028"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.642362 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-service-ca" (OuterVolumeSpecName: "service-ca") pod "a79b34eb-9b98-45d3-b470-c3925639b028" (UID: "a79b34eb-9b98-45d3-b470-c3925639b028"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.643047 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-console-config" (OuterVolumeSpecName: "console-config") pod "a79b34eb-9b98-45d3-b470-c3925639b028" (UID: "a79b34eb-9b98-45d3-b470-c3925639b028"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.643129 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a79b34eb-9b98-45d3-b470-c3925639b028" (UID: "a79b34eb-9b98-45d3-b470-c3925639b028"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.648147 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79b34eb-9b98-45d3-b470-c3925639b028-kube-api-access-5xn64" (OuterVolumeSpecName: "kube-api-access-5xn64") pod "a79b34eb-9b98-45d3-b470-c3925639b028" (UID: "a79b34eb-9b98-45d3-b470-c3925639b028"). InnerVolumeSpecName "kube-api-access-5xn64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.648318 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a79b34eb-9b98-45d3-b470-c3925639b028" (UID: "a79b34eb-9b98-45d3-b470-c3925639b028"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.651638 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a79b34eb-9b98-45d3-b470-c3925639b028" (UID: "a79b34eb-9b98-45d3-b470-c3925639b028"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.743715 4713 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.744142 4713 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a79b34eb-9b98-45d3-b470-c3925639b028-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.744156 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xn64\" (UniqueName: \"kubernetes.io/projected/a79b34eb-9b98-45d3-b470-c3925639b028-kube-api-access-5xn64\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.744166 4713 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-console-config\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.744176 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-service-ca\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.744185 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.744193 4713 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a79b34eb-9b98-45d3-b470-c3925639b028-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 14 05:51:09 crc kubenswrapper[4713]: I0314 05:51:09.932862 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-k57vf"]
Mar 14 05:51:09 crc kubenswrapper[4713]: W0314 05:51:09.934803 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb4f6d50_5931_4dec_82ed_606d0a53fb6e.slice/crio-cba5d3df262fe8641516e9b6408a97a1aea24b9b56bcc9ee3d379effeff00af2 WatchSource:0}: Error finding container cba5d3df262fe8641516e9b6408a97a1aea24b9b56bcc9ee3d379effeff00af2: Status 404 returned error can't find the container with id cba5d3df262fe8641516e9b6408a97a1aea24b9b56bcc9ee3d379effeff00af2
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.084364 4713 generic.go:334] "Generic (PLEG): container finished" podID="5f655395-3e0b-452b-a92a-16ac7edb5707" containerID="484baeff2604aaaea6f8e0fb3e73f32235d5013349a6e15e4dbd13359b25f5d5" exitCode=0
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.084471 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v" event={"ID":"5f655395-3e0b-452b-a92a-16ac7edb5707","Type":"ContainerDied","Data":"484baeff2604aaaea6f8e0fb3e73f32235d5013349a6e15e4dbd13359b25f5d5"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.088922 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35c58160-d324-41f9-8c2d-410ba3fb1bb5","Type":"ContainerStarted","Data":"16cccd1c1879208b644d1ce063fc0207893ec42aa0044d8dfb2dcc4ce8e5f02a"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.089245 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.092768 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qc8rx" event={"ID":"1341c453-d963-4a43-a264-0f94dd02b7dd","Type":"ContainerStarted","Data":"be875375f3aa358cdb14f24bb169618071b02d8e58728840977c3a62dc37c92a"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.095187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerStarted","Data":"cf44e192fe4ea925e6d71d88a2d22a947348208ebc497842573b83b56c8f663a"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.097002 4713 generic.go:334] "Generic (PLEG): container finished" podID="f50144ed-c64d-4618-9ac5-afcf5aea1812" containerID="900b57803c150fa7f7d5259939e95ff7c73678ab02eb25cff96418b1217ef2bf" exitCode=0
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.097077 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9" event={"ID":"f50144ed-c64d-4618-9ac5-afcf5aea1812","Type":"ContainerDied","Data":"900b57803c150fa7f7d5259939e95ff7c73678ab02eb25cff96418b1217ef2bf"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.099460 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b8fef26f-0e1b-4e81-8969-a4b972708cb3","Type":"ContainerStarted","Data":"0e7788bf9dfb7f8e79a12f2578730a7fec1eb26cd7487a7c7aa6811cba7114bf"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.100392 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.101355 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k57vf" event={"ID":"eb4f6d50-5931-4dec-82ed-606d0a53fb6e","Type":"ContainerStarted","Data":"cba5d3df262fe8641516e9b6408a97a1aea24b9b56bcc9ee3d379effeff00af2"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.104752 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-25kgx" event={"ID":"c2264856-3255-43b9-9fd5-50dd1f7d5797","Type":"ContainerStarted","Data":"9eae25a1f97b97f263464a970d67909a37d58436ffb95b919d1a7b642f84de77"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.104785 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-25kgx" event={"ID":"c2264856-3255-43b9-9fd5-50dd1f7d5797","Type":"ContainerStarted","Data":"9909549ec28562dea56d63244a04201565cfaeb026eb6f00753b7ad8a7e16356"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.107643 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca9b4055-d903-491e-bbf8-4777d51a1af8","Type":"ContainerStarted","Data":"b19fb7a7667d09e3c59d54fcbe249af2fd6a6fdc99fde2d16643240caf47febe"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.108000 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.112187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"145c4018-82f1-49b5-9d3b-15c97c299a4a","Type":"ContainerStarted","Data":"2d261e1bd6c66bf317e7962278ac94bd87c70e5cdbe5c85550fac6df5c0a208b"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.112630 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.114755 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-999d8b566-8h8bm_a79b34eb-9b98-45d3-b470-c3925639b028/console/0.log"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.114818 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-999d8b566-8h8bm" event={"ID":"a79b34eb-9b98-45d3-b470-c3925639b028","Type":"ContainerDied","Data":"5e0ed0bd91dc24dcffc9cd139d535ecd7ad69afe9428e56b270bfdbe94068e61"}
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.114852 4713 scope.go:117] "RemoveContainer" containerID="bb3065bb5c298c9255c05eea1c69f4c10236bf269c53c3c06af989819c461aa8"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.114999 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-999d8b566-8h8bm"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.156438 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=38.481826699 podStartE2EDuration="56.156420611s" podCreationTimestamp="2026-03-14 05:50:14 +0000 UTC" firstStartedPulling="2026-03-14 05:50:16.669332792 +0000 UTC m=+1399.757242092" lastFinishedPulling="2026-03-14 05:50:34.343926714 +0000 UTC m=+1417.431836004" observedRunningTime="2026-03-14 05:51:10.151126403 +0000 UTC m=+1453.239035713" watchObservedRunningTime="2026-03-14 05:51:10.156420611 +0000 UTC m=+1453.244329901"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.171340 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-25kgx" podStartSLOduration=4.171315386 podStartE2EDuration="4.171315386s" podCreationTimestamp="2026-03-14 05:51:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:51:10.169011853 +0000 UTC m=+1453.256921153" watchObservedRunningTime="2026-03-14 05:51:10.171315386 +0000 UTC m=+1453.259224686"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.200000 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.992454522 podStartE2EDuration="57.199977079s" podCreationTimestamp="2026-03-14 05:50:13 +0000 UTC" firstStartedPulling="2026-03-14 05:50:16.266551604 +0000 UTC m=+1399.354460904" lastFinishedPulling="2026-03-14 05:50:34.474074161 +0000 UTC m=+1417.561983461" observedRunningTime="2026-03-14 05:51:10.195128715 +0000 UTC m=+1453.283038015" watchObservedRunningTime="2026-03-14 05:51:10.199977079 +0000 UTC m=+1453.287886389"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.216388 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-qc8rx" podStartSLOduration=4.978352243 podStartE2EDuration="15.216369871s" podCreationTimestamp="2026-03-14 05:50:55 +0000 UTC" firstStartedPulling="2026-03-14 05:50:58.683556931 +0000 UTC m=+1441.771466231" lastFinishedPulling="2026-03-14 05:51:08.921574549 +0000 UTC m=+1452.009483859" observedRunningTime="2026-03-14 05:51:10.214978518 +0000 UTC m=+1453.302887818" watchObservedRunningTime="2026-03-14 05:51:10.216369871 +0000 UTC m=+1453.304279171"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.274450 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.119449738 podStartE2EDuration="56.274431671s" podCreationTimestamp="2026-03-14 05:50:14 +0000 UTC" firstStartedPulling="2026-03-14 05:50:16.356742079 +0000 UTC m=+1399.444651379" lastFinishedPulling="2026-03-14 05:50:34.511724012 +0000 UTC m=+1417.599633312" observedRunningTime="2026-03-14 05:51:10.25243166 +0000 UTC m=+1453.340340960" watchObservedRunningTime="2026-03-14 05:51:10.274431671 +0000 UTC m=+1453.362340971"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.312220 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=38.180093041 podStartE2EDuration="56.312177733s" podCreationTimestamp="2026-03-14 05:50:14 +0000 UTC" firstStartedPulling="2026-03-14 05:50:16.359238429 +0000 UTC m=+1399.447147729" lastFinishedPulling="2026-03-14 05:50:34.491323121 +0000 UTC m=+1417.579232421" observedRunningTime="2026-03-14 05:51:10.297055061 +0000 UTC m=+1453.384964371" watchObservedRunningTime="2026-03-14 05:51:10.312177733 +0000 UTC m=+1453.400087033"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.340225 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-999d8b566-8h8bm"]
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.354930 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-999d8b566-8h8bm"]
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.572667 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.733058 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 05:51:10 crc kubenswrapper[4713]: I0314 05:51:10.733109 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.137095 4713 generic.go:334] "Generic (PLEG): container finished" podID="c2264856-3255-43b9-9fd5-50dd1f7d5797" containerID="9eae25a1f97b97f263464a970d67909a37d58436ffb95b919d1a7b642f84de77" exitCode=0
Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.137179 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-25kgx" event={"ID":"c2264856-3255-43b9-9fd5-50dd1f7d5797","Type":"ContainerDied","Data":"9eae25a1f97b97f263464a970d67909a37d58436ffb95b919d1a7b642f84de77"}
Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.579404 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79b34eb-9b98-45d3-b470-c3925639b028" path="/var/lib/kubelet/pods/a79b34eb-9b98-45d3-b470-c3925639b028/volumes"
Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.766477 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9"
Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.775676 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"
Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.895782 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm8lt\" (UniqueName: \"kubernetes.io/projected/5f655395-3e0b-452b-a92a-16ac7edb5707-kube-api-access-mm8lt\") pod \"5f655395-3e0b-452b-a92a-16ac7edb5707\" (UID: \"5f655395-3e0b-452b-a92a-16ac7edb5707\") "
Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.895836 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50144ed-c64d-4618-9ac5-afcf5aea1812-operator-scripts\") pod \"f50144ed-c64d-4618-9ac5-afcf5aea1812\" (UID: \"f50144ed-c64d-4618-9ac5-afcf5aea1812\") "
Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.895875 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f655395-3e0b-452b-a92a-16ac7edb5707-operator-scripts\") pod \"5f655395-3e0b-452b-a92a-16ac7edb5707\" (UID: \"5f655395-3e0b-452b-a92a-16ac7edb5707\") "
Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.895948 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6mzt\" (UniqueName: \"kubernetes.io/projected/f50144ed-c64d-4618-9ac5-afcf5aea1812-kube-api-access-p6mzt\") pod \"f50144ed-c64d-4618-9ac5-afcf5aea1812\" (UID: \"f50144ed-c64d-4618-9ac5-afcf5aea1812\") " Mar 14
05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.896654 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f50144ed-c64d-4618-9ac5-afcf5aea1812-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f50144ed-c64d-4618-9ac5-afcf5aea1812" (UID: "f50144ed-c64d-4618-9ac5-afcf5aea1812"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.896665 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f655395-3e0b-452b-a92a-16ac7edb5707-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f655395-3e0b-452b-a92a-16ac7edb5707" (UID: "5f655395-3e0b-452b-a92a-16ac7edb5707"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.896924 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f50144ed-c64d-4618-9ac5-afcf5aea1812-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.896943 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f655395-3e0b-452b-a92a-16ac7edb5707-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.907798 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50144ed-c64d-4618-9ac5-afcf5aea1812-kube-api-access-p6mzt" (OuterVolumeSpecName: "kube-api-access-p6mzt") pod "f50144ed-c64d-4618-9ac5-afcf5aea1812" (UID: "f50144ed-c64d-4618-9ac5-afcf5aea1812"). InnerVolumeSpecName "kube-api-access-p6mzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.914782 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f655395-3e0b-452b-a92a-16ac7edb5707-kube-api-access-mm8lt" (OuterVolumeSpecName: "kube-api-access-mm8lt") pod "5f655395-3e0b-452b-a92a-16ac7edb5707" (UID: "5f655395-3e0b-452b-a92a-16ac7edb5707"). InnerVolumeSpecName "kube-api-access-mm8lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.998849 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm8lt\" (UniqueName: \"kubernetes.io/projected/5f655395-3e0b-452b-a92a-16ac7edb5707-kube-api-access-mm8lt\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:11 crc kubenswrapper[4713]: I0314 05:51:11.998885 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6mzt\" (UniqueName: \"kubernetes.io/projected/f50144ed-c64d-4618-9ac5-afcf5aea1812-kube-api-access-p6mzt\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.153955 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9" event={"ID":"f50144ed-c64d-4618-9ac5-afcf5aea1812","Type":"ContainerDied","Data":"d779eafe18a9eb55f5d61b785535c2750bdb09366745108c2e30f9e1d5c109c4"} Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.154287 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d779eafe18a9eb55f5d61b785535c2750bdb09366745108c2e30f9e1d5c109c4" Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.154389 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-03ff-account-create-update-wrwt9" Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.156338 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v" Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.156346 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v" event={"ID":"5f655395-3e0b-452b-a92a-16ac7edb5707","Type":"ContainerDied","Data":"da260d57a40209ce9d8e875f79ea8eb604038bd427b7b95df339585c2beb5395"} Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.156405 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da260d57a40209ce9d8e875f79ea8eb604038bd427b7b95df339585c2beb5395" Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.549628 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-25kgx" Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.713975 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2264856-3255-43b9-9fd5-50dd1f7d5797-operator-scripts\") pod \"c2264856-3255-43b9-9fd5-50dd1f7d5797\" (UID: \"c2264856-3255-43b9-9fd5-50dd1f7d5797\") " Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.714316 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4cmh\" (UniqueName: \"kubernetes.io/projected/c2264856-3255-43b9-9fd5-50dd1f7d5797-kube-api-access-s4cmh\") pod \"c2264856-3255-43b9-9fd5-50dd1f7d5797\" (UID: \"c2264856-3255-43b9-9fd5-50dd1f7d5797\") " Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.714436 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2264856-3255-43b9-9fd5-50dd1f7d5797-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2264856-3255-43b9-9fd5-50dd1f7d5797" (UID: "c2264856-3255-43b9-9fd5-50dd1f7d5797"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.715026 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2264856-3255-43b9-9fd5-50dd1f7d5797-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.719008 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2264856-3255-43b9-9fd5-50dd1f7d5797-kube-api-access-s4cmh" (OuterVolumeSpecName: "kube-api-access-s4cmh") pod "c2264856-3255-43b9-9fd5-50dd1f7d5797" (UID: "c2264856-3255-43b9-9fd5-50dd1f7d5797"). InnerVolumeSpecName "kube-api-access-s4cmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:12 crc kubenswrapper[4713]: I0314 05:51:12.817801 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4cmh\" (UniqueName: \"kubernetes.io/projected/c2264856-3255-43b9-9fd5-50dd1f7d5797-kube-api-access-s4cmh\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:13 crc kubenswrapper[4713]: I0314 05:51:13.180817 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerStarted","Data":"9a8cc2ec784962e0c67d3d727ed39a2bc7569f155de51448b671d53d3931e0c1"} Mar 14 05:51:13 crc kubenswrapper[4713]: I0314 05:51:13.183598 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-25kgx" event={"ID":"c2264856-3255-43b9-9fd5-50dd1f7d5797","Type":"ContainerDied","Data":"9909549ec28562dea56d63244a04201565cfaeb026eb6f00753b7ad8a7e16356"} Mar 14 05:51:13 crc kubenswrapper[4713]: I0314 05:51:13.183627 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9909549ec28562dea56d63244a04201565cfaeb026eb6f00753b7ad8a7e16356" Mar 14 05:51:13 crc kubenswrapper[4713]: I0314 05:51:13.183702 4713 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-25kgx" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.068828 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lk79w" podUID="2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca" containerName="ovn-controller" probeResult="failure" output=< Mar 14 05:51:14 crc kubenswrapper[4713]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 14 05:51:14 crc kubenswrapper[4713]: > Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.161298 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6pzwm" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.173547 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6pzwm" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.415855 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lk79w-config-p69z6"] Mar 14 05:51:14 crc kubenswrapper[4713]: E0314 05:51:14.418805 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2264856-3255-43b9-9fd5-50dd1f7d5797" containerName="mariadb-account-create-update" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.418965 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2264856-3255-43b9-9fd5-50dd1f7d5797" containerName="mariadb-account-create-update" Mar 14 05:51:14 crc kubenswrapper[4713]: E0314 05:51:14.419090 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f655395-3e0b-452b-a92a-16ac7edb5707" containerName="mariadb-database-create" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.419151 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f655395-3e0b-452b-a92a-16ac7edb5707" containerName="mariadb-database-create" Mar 14 05:51:14 crc kubenswrapper[4713]: E0314 05:51:14.419267 4713 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50144ed-c64d-4618-9ac5-afcf5aea1812" containerName="mariadb-account-create-update" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.419365 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50144ed-c64d-4618-9ac5-afcf5aea1812" containerName="mariadb-account-create-update" Mar 14 05:51:14 crc kubenswrapper[4713]: E0314 05:51:14.419462 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79b34eb-9b98-45d3-b470-c3925639b028" containerName="console" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.419538 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79b34eb-9b98-45d3-b470-c3925639b028" containerName="console" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.419800 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50144ed-c64d-4618-9ac5-afcf5aea1812" containerName="mariadb-account-create-update" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.419865 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2264856-3255-43b9-9fd5-50dd1f7d5797" containerName="mariadb-account-create-update" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.419930 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f655395-3e0b-452b-a92a-16ac7edb5707" containerName="mariadb-database-create" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.419995 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79b34eb-9b98-45d3-b470-c3925639b028" containerName="console" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.420776 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.423224 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.433112 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lk79w-config-p69z6"] Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.560152 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run-ovn\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.560556 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-scripts\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.560899 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-additional-scripts\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.561083 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: 
\"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.561396 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9srj\" (UniqueName: \"kubernetes.io/projected/ca833874-bd20-45e4-a7cd-4ca26596492a-kube-api-access-p9srj\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.561618 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-log-ovn\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.665930 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9srj\" (UniqueName: \"kubernetes.io/projected/ca833874-bd20-45e4-a7cd-4ca26596492a-kube-api-access-p9srj\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.666105 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-log-ovn\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.666159 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run-ovn\") pod 
\"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.666397 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-scripts\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.666517 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-additional-scripts\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.666722 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-log-ovn\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.667331 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-additional-scripts\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.667562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run\") pod \"ovn-controller-lk79w-config-p69z6\" 
(UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.667722 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.667745 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run-ovn\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.668673 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-scripts\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.699162 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9srj\" (UniqueName: \"kubernetes.io/projected/ca833874-bd20-45e4-a7cd-4ca26596492a-kube-api-access-p9srj\") pod \"ovn-controller-lk79w-config-p69z6\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:14 crc kubenswrapper[4713]: I0314 05:51:14.740457 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:15 crc kubenswrapper[4713]: I0314 05:51:15.343343 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lk79w-config-p69z6"] Mar 14 05:51:15 crc kubenswrapper[4713]: W0314 05:51:15.347136 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca833874_bd20_45e4_a7cd_4ca26596492a.slice/crio-8c00ed4c9bc423ee7741b36c60f9f84980cc946b5095f073a3f246ecdf37e81e WatchSource:0}: Error finding container 8c00ed4c9bc423ee7741b36c60f9f84980cc946b5095f073a3f246ecdf37e81e: Status 404 returned error can't find the container with id 8c00ed4c9bc423ee7741b36c60f9f84980cc946b5095f073a3f246ecdf37e81e Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.236767 4713 generic.go:334] "Generic (PLEG): container finished" podID="ca833874-bd20-45e4-a7cd-4ca26596492a" containerID="492f85d8eb589d263332a6812524e48f4fe17fbd7412ad95587490cc6b501a72" exitCode=0 Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.237065 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lk79w-config-p69z6" event={"ID":"ca833874-bd20-45e4-a7cd-4ca26596492a","Type":"ContainerDied","Data":"492f85d8eb589d263332a6812524e48f4fe17fbd7412ad95587490cc6b501a72"} Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.237092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lk79w-config-p69z6" event={"ID":"ca833874-bd20-45e4-a7cd-4ca26596492a","Type":"ContainerStarted","Data":"8c00ed4c9bc423ee7741b36c60f9f84980cc946b5095f073a3f246ecdf37e81e"} Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.516401 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.518109 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.527951 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.537543 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.612189 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klgtp\" (UniqueName: \"kubernetes.io/projected/6590feb6-7a54-4dcb-8656-60f55d65a5f2-kube-api-access-klgtp\") pod \"mysqld-exporter-0\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.612289 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.612336 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-config-data\") pod \"mysqld-exporter-0\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.717562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klgtp\" (UniqueName: \"kubernetes.io/projected/6590feb6-7a54-4dcb-8656-60f55d65a5f2-kube-api-access-klgtp\") pod \"mysqld-exporter-0\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.717960 
4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.718016 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-config-data\") pod \"mysqld-exporter-0\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.729537 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.730217 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-config-data\") pod \"mysqld-exporter-0\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.753934 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klgtp\" (UniqueName: \"kubernetes.io/projected/6590feb6-7a54-4dcb-8656-60f55d65a5f2-kube-api-access-klgtp\") pod \"mysqld-exporter-0\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " pod="openstack/mysqld-exporter-0" Mar 14 05:51:16 crc kubenswrapper[4713]: I0314 05:51:16.866456 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 14 05:51:17 crc kubenswrapper[4713]: I0314 05:51:17.252171 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerStarted","Data":"0840b4f89bc3a9b5375b7ebac143e59dad56ec2d53d97bf9f17d704ab78c944f"} Mar 14 05:51:17 crc kubenswrapper[4713]: I0314 05:51:17.296123 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.009293846 podStartE2EDuration="57.29610076s" podCreationTimestamp="2026-03-14 05:50:20 +0000 UTC" firstStartedPulling="2026-03-14 05:50:34.28641479 +0000 UTC m=+1417.374324090" lastFinishedPulling="2026-03-14 05:51:16.573221704 +0000 UTC m=+1459.661131004" observedRunningTime="2026-03-14 05:51:17.286827384 +0000 UTC m=+1460.374736684" watchObservedRunningTime="2026-03-14 05:51:17.29610076 +0000 UTC m=+1460.384010060" Mar 14 05:51:18 crc kubenswrapper[4713]: I0314 05:51:18.272170 4713 generic.go:334] "Generic (PLEG): container finished" podID="1341c453-d963-4a43-a264-0f94dd02b7dd" containerID="be875375f3aa358cdb14f24bb169618071b02d8e58728840977c3a62dc37c92a" exitCode=0 Mar 14 05:51:18 crc kubenswrapper[4713]: I0314 05:51:18.272277 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qc8rx" event={"ID":"1341c453-d963-4a43-a264-0f94dd02b7dd","Type":"ContainerDied","Data":"be875375f3aa358cdb14f24bb169618071b02d8e58728840977c3a62dc37c92a"} Mar 14 05:51:19 crc kubenswrapper[4713]: I0314 05:51:19.061373 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lk79w" Mar 14 05:51:21 crc kubenswrapper[4713]: I0314 05:51:21.941262 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:21 crc kubenswrapper[4713]: I0314 05:51:21.941879 4713 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:21 crc kubenswrapper[4713]: I0314 05:51:21.945735 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:22 crc kubenswrapper[4713]: I0314 05:51:22.315889 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:23 crc kubenswrapper[4713]: I0314 05:51:23.799388 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:51:23 crc kubenswrapper[4713]: I0314 05:51:23.820610 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80f03c3b-d224-4e9d-8e52-e0376b3f215f-etc-swift\") pod \"swift-storage-0\" (UID: \"80f03c3b-d224-4e9d-8e52-e0376b3f215f\") " pod="openstack/swift-storage-0" Mar 14 05:51:24 crc kubenswrapper[4713]: I0314 05:51:24.026313 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 14 05:51:25 crc kubenswrapper[4713]: I0314 05:51:25.386005 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 05:51:25 crc kubenswrapper[4713]: I0314 05:51:25.386326 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="prometheus" containerID="cri-o://cf44e192fe4ea925e6d71d88a2d22a947348208ebc497842573b83b56c8f663a" gracePeriod=600 Mar 14 05:51:25 crc kubenswrapper[4713]: I0314 05:51:25.386824 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="thanos-sidecar" containerID="cri-o://0840b4f89bc3a9b5375b7ebac143e59dad56ec2d53d97bf9f17d704ab78c944f" gracePeriod=600 Mar 14 05:51:25 crc kubenswrapper[4713]: I0314 05:51:25.386904 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="config-reloader" containerID="cri-o://9a8cc2ec784962e0c67d3d727ed39a2bc7569f155de51448b671d53d3931e0c1" gracePeriod=600 Mar 14 05:51:25 crc kubenswrapper[4713]: I0314 05:51:25.592412 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:51:25 crc kubenswrapper[4713]: I0314 05:51:25.662450 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 14 05:51:25 crc kubenswrapper[4713]: I0314 05:51:25.726393 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="145c4018-82f1-49b5-9d3b-15c97c299a4a" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 14 05:51:25 crc kubenswrapper[4713]: I0314 05:51:25.998739 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.137:5671: connect: connection refused" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.380074 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qc8rx" event={"ID":"1341c453-d963-4a43-a264-0f94dd02b7dd","Type":"ContainerDied","Data":"46e9b1560b1aa461703bdf8be388baee4a8165314f2d4397e10e05affc9a138c"} Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.380361 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46e9b1560b1aa461703bdf8be388baee4a8165314f2d4397e10e05affc9a138c" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.384430 4713 generic.go:334] "Generic (PLEG): container finished" podID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerID="0840b4f89bc3a9b5375b7ebac143e59dad56ec2d53d97bf9f17d704ab78c944f" exitCode=0 Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.384472 4713 generic.go:334] "Generic (PLEG): container finished" podID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerID="9a8cc2ec784962e0c67d3d727ed39a2bc7569f155de51448b671d53d3931e0c1" exitCode=0 Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.384487 4713 generic.go:334] "Generic (PLEG): container finished" podID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerID="cf44e192fe4ea925e6d71d88a2d22a947348208ebc497842573b83b56c8f663a" exitCode=0 Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.384539 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerDied","Data":"0840b4f89bc3a9b5375b7ebac143e59dad56ec2d53d97bf9f17d704ab78c944f"} Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.384569 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerDied","Data":"9a8cc2ec784962e0c67d3d727ed39a2bc7569f155de51448b671d53d3931e0c1"} Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.384578 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerDied","Data":"cf44e192fe4ea925e6d71d88a2d22a947348208ebc497842573b83b56c8f663a"} Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.388073 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lk79w-config-p69z6" event={"ID":"ca833874-bd20-45e4-a7cd-4ca26596492a","Type":"ContainerDied","Data":"8c00ed4c9bc423ee7741b36c60f9f84980cc946b5095f073a3f246ecdf37e81e"} Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.388106 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c00ed4c9bc423ee7741b36c60f9f84980cc946b5095f073a3f246ecdf37e81e" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.450114 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.473652 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611190 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb9r4\" (UniqueName: \"kubernetes.io/projected/1341c453-d963-4a43-a264-0f94dd02b7dd-kube-api-access-fb9r4\") pod \"1341c453-d963-4a43-a264-0f94dd02b7dd\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611511 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-ring-data-devices\") pod \"1341c453-d963-4a43-a264-0f94dd02b7dd\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611543 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-scripts\") pod \"1341c453-d963-4a43-a264-0f94dd02b7dd\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611574 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1341c453-d963-4a43-a264-0f94dd02b7dd-etc-swift\") pod \"1341c453-d963-4a43-a264-0f94dd02b7dd\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611675 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-swiftconf\") pod \"1341c453-d963-4a43-a264-0f94dd02b7dd\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611714 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-combined-ca-bundle\") pod \"1341c453-d963-4a43-a264-0f94dd02b7dd\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611766 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-log-ovn\") pod \"ca833874-bd20-45e4-a7cd-4ca26596492a\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611827 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9srj\" (UniqueName: \"kubernetes.io/projected/ca833874-bd20-45e4-a7cd-4ca26596492a-kube-api-access-p9srj\") pod \"ca833874-bd20-45e4-a7cd-4ca26596492a\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611859 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-dispersionconf\") pod \"1341c453-d963-4a43-a264-0f94dd02b7dd\" (UID: \"1341c453-d963-4a43-a264-0f94dd02b7dd\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611879 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run\") pod \"ca833874-bd20-45e4-a7cd-4ca26596492a\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611918 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-scripts\") pod \"ca833874-bd20-45e4-a7cd-4ca26596492a\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.611960 
4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run-ovn\") pod \"ca833874-bd20-45e4-a7cd-4ca26596492a\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.612009 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-additional-scripts\") pod \"ca833874-bd20-45e4-a7cd-4ca26596492a\" (UID: \"ca833874-bd20-45e4-a7cd-4ca26596492a\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.613172 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ca833874-bd20-45e4-a7cd-4ca26596492a" (UID: "ca833874-bd20-45e4-a7cd-4ca26596492a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.628186 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ca833874-bd20-45e4-a7cd-4ca26596492a" (UID: "ca833874-bd20-45e4-a7cd-4ca26596492a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.628246 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run" (OuterVolumeSpecName: "var-run") pod "ca833874-bd20-45e4-a7cd-4ca26596492a" (UID: "ca833874-bd20-45e4-a7cd-4ca26596492a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.628896 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1341c453-d963-4a43-a264-0f94dd02b7dd" (UID: "1341c453-d963-4a43-a264-0f94dd02b7dd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.629876 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-scripts" (OuterVolumeSpecName: "scripts") pod "ca833874-bd20-45e4-a7cd-4ca26596492a" (UID: "ca833874-bd20-45e4-a7cd-4ca26596492a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.629919 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ca833874-bd20-45e4-a7cd-4ca26596492a" (UID: "ca833874-bd20-45e4-a7cd-4ca26596492a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.632918 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1341c453-d963-4a43-a264-0f94dd02b7dd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1341c453-d963-4a43-a264-0f94dd02b7dd" (UID: "1341c453-d963-4a43-a264-0f94dd02b7dd"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.634850 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1341c453-d963-4a43-a264-0f94dd02b7dd-kube-api-access-fb9r4" (OuterVolumeSpecName: "kube-api-access-fb9r4") pod "1341c453-d963-4a43-a264-0f94dd02b7dd" (UID: "1341c453-d963-4a43-a264-0f94dd02b7dd"). InnerVolumeSpecName "kube-api-access-fb9r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.642229 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca833874-bd20-45e4-a7cd-4ca26596492a-kube-api-access-p9srj" (OuterVolumeSpecName: "kube-api-access-p9srj") pod "ca833874-bd20-45e4-a7cd-4ca26596492a" (UID: "ca833874-bd20-45e4-a7cd-4ca26596492a"). InnerVolumeSpecName "kube-api-access-p9srj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.661426 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-scripts" (OuterVolumeSpecName: "scripts") pod "1341c453-d963-4a43-a264-0f94dd02b7dd" (UID: "1341c453-d963-4a43-a264-0f94dd02b7dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.663495 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1341c453-d963-4a43-a264-0f94dd02b7dd" (UID: "1341c453-d963-4a43-a264-0f94dd02b7dd"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.685627 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1341c453-d963-4a43-a264-0f94dd02b7dd" (UID: "1341c453-d963-4a43-a264-0f94dd02b7dd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.697021 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.714444 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb9r4\" (UniqueName: \"kubernetes.io/projected/1341c453-d963-4a43-a264-0f94dd02b7dd-kube-api-access-fb9r4\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.714601 4713 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.714702 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1341c453-d963-4a43-a264-0f94dd02b7dd-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.714783 4713 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1341c453-d963-4a43-a264-0f94dd02b7dd-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.714873 4713 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 
05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.714960 4713 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.715036 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9srj\" (UniqueName: \"kubernetes.io/projected/ca833874-bd20-45e4-a7cd-4ca26596492a-kube-api-access-p9srj\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.715107 4713 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.715192 4713 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.715293 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.715372 4713 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca833874-bd20-45e4-a7cd-4ca26596492a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.715454 4713 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca833874-bd20-45e4-a7cd-4ca26596492a-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.735520 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1341c453-d963-4a43-a264-0f94dd02b7dd" (UID: "1341c453-d963-4a43-a264-0f94dd02b7dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.817781 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-tls-assets\") pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.817901 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.817991 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8eea30f2-9f63-4f93-a711-48fee1d631c2-config-out\") pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.818012 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-web-config\") pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.818042 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-0\") pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.818080 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-1\") pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.818112 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-thanos-prometheus-http-client-file\") pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.818143 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-2\") pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.818163 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn6p7\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-kube-api-access-kn6p7\") pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.818221 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-config\") 
pod \"8eea30f2-9f63-4f93-a711-48fee1d631c2\" (UID: \"8eea30f2-9f63-4f93-a711-48fee1d631c2\") " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.818834 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1341c453-d963-4a43-a264-0f94dd02b7dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.819615 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.821462 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.821494 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.824237 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.824801 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eea30f2-9f63-4f93-a711-48fee1d631c2-config-out" (OuterVolumeSpecName: "config-out") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.825181 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-config" (OuterVolumeSpecName: "config") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.828193 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.830576 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-kube-api-access-kn6p7" (OuterVolumeSpecName: "kube-api-access-kn6p7") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "kube-api-access-kn6p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.851011 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-web-config" (OuterVolumeSpecName: "web-config") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.857280 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "8eea30f2-9f63-4f93-a711-48fee1d631c2" (UID: "8eea30f2-9f63-4f93-a711-48fee1d631c2"). InnerVolumeSpecName "pvc-1fe7653e-2622-4864-bb53-7a533f3742d8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.921089 4713 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.921141 4713 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.921158 4713 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.921172 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn6p7\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-kube-api-access-kn6p7\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.921265 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.921278 4713 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8eea30f2-9f63-4f93-a711-48fee1d631c2-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.921543 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") on node \"crc\" " Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.921568 4713 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8eea30f2-9f63-4f93-a711-48fee1d631c2-config-out\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.921594 4713 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8eea30f2-9f63-4f93-a711-48fee1d631c2-web-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.922157 4713 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8eea30f2-9f63-4f93-a711-48fee1d631c2-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.941591 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.957421 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.973679 4713 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 14 05:51:26 crc kubenswrapper[4713]: I0314 05:51:26.973854 4713 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1fe7653e-2622-4864-bb53-7a533f3742d8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8") on node "crc" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.023533 4713 reconciler_common.go:293] "Volume detached for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.163354 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 14 05:51:27 crc kubenswrapper[4713]: W0314 05:51:27.163649 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80f03c3b_d224_4e9d_8e52_e0376b3f215f.slice/crio-76f891d10d0d59ead560ed89de7fd1013bf77bb9a0161ad76a8983f3488058bc WatchSource:0}: Error finding container 76f891d10d0d59ead560ed89de7fd1013bf77bb9a0161ad76a8983f3488058bc: Status 404 returned error can't find the container with id 76f891d10d0d59ead560ed89de7fd1013bf77bb9a0161ad76a8983f3488058bc Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.404995 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k57vf" event={"ID":"eb4f6d50-5931-4dec-82ed-606d0a53fb6e","Type":"ContainerStarted","Data":"51d42e8f7874016a02750cc5ed46e5ace9e5c9d2e7df4f0e0842bc73f90e1203"} Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.407347 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"6590feb6-7a54-4dcb-8656-60f55d65a5f2","Type":"ContainerStarted","Data":"07d1c04095dbdf8fa8c6658f2e1460ce9d1967c1168bc8c1a6eb0c021501c658"} Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.409261 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"76f891d10d0d59ead560ed89de7fd1013bf77bb9a0161ad76a8983f3488058bc"} Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.414565 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lk79w-config-p69z6" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.414585 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qc8rx" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.414555 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8eea30f2-9f63-4f93-a711-48fee1d631c2","Type":"ContainerDied","Data":"0fbe9df95816c6cff2bb12345300d473a4691a08c6e77ebeb415471f3c6a2955"} Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.414720 4713 scope.go:117] "RemoveContainer" containerID="0840b4f89bc3a9b5375b7ebac143e59dad56ec2d53d97bf9f17d704ab78c944f" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.414911 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.442169 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-k57vf" podStartSLOduration=3.043540299 podStartE2EDuration="19.442145109s" podCreationTimestamp="2026-03-14 05:51:08 +0000 UTC" firstStartedPulling="2026-03-14 05:51:09.950982528 +0000 UTC m=+1453.038891828" lastFinishedPulling="2026-03-14 05:51:26.349587338 +0000 UTC m=+1469.437496638" observedRunningTime="2026-03-14 05:51:27.434490325 +0000 UTC m=+1470.522399635" watchObservedRunningTime="2026-03-14 05:51:27.442145109 +0000 UTC m=+1470.530054419" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.446015 4713 scope.go:117] "RemoveContainer" containerID="9a8cc2ec784962e0c67d3d727ed39a2bc7569f155de51448b671d53d3931e0c1" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.489517 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.490306 4713 scope.go:117] "RemoveContainer" containerID="cf44e192fe4ea925e6d71d88a2d22a947348208ebc497842573b83b56c8f663a" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.503775 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.523709 4713 scope.go:117] "RemoveContainer" containerID="5d86cd1a4eac22c48e3f31d996967f870b5305aea6e524f8a9ae9abe1370021a" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.533088 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 05:51:27 crc kubenswrapper[4713]: E0314 05:51:27.533618 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="prometheus" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.533642 4713 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="prometheus" Mar 14 05:51:27 crc kubenswrapper[4713]: E0314 05:51:27.533672 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="init-config-reloader" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.533683 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="init-config-reloader" Mar 14 05:51:27 crc kubenswrapper[4713]: E0314 05:51:27.533706 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="thanos-sidecar" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.533714 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="thanos-sidecar" Mar 14 05:51:27 crc kubenswrapper[4713]: E0314 05:51:27.533724 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca833874-bd20-45e4-a7cd-4ca26596492a" containerName="ovn-config" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.533731 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca833874-bd20-45e4-a7cd-4ca26596492a" containerName="ovn-config" Mar 14 05:51:27 crc kubenswrapper[4713]: E0314 05:51:27.533746 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="config-reloader" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.533754 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="config-reloader" Mar 14 05:51:27 crc kubenswrapper[4713]: E0314 05:51:27.533771 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1341c453-d963-4a43-a264-0f94dd02b7dd" containerName="swift-ring-rebalance" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.533780 4713 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1341c453-d963-4a43-a264-0f94dd02b7dd" containerName="swift-ring-rebalance" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.534038 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="thanos-sidecar" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.534052 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="prometheus" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.534078 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1341c453-d963-4a43-a264-0f94dd02b7dd" containerName="swift-ring-rebalance" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.534094 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" containerName="config-reloader" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.534104 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca833874-bd20-45e4-a7cd-4ca26596492a" containerName="ovn-config" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.545451 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.549546 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.549852 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.549928 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.549974 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.550083 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.550732 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-pfd5j" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.550966 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.551165 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.567117 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.584010 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.641804 4713 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="8eea30f2-9f63-4f93-a711-48fee1d631c2" path="/var/lib/kubelet/pods/8eea30f2-9f63-4f93-a711-48fee1d631c2/volumes" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.654981 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e00395b-5b37-4ba4-a4e7-7ad08388b053-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.655092 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.655158 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.655249 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.655461 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5e00395b-5b37-4ba4-a4e7-7ad08388b053-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.655529 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e00395b-5b37-4ba4-a4e7-7ad08388b053-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.655604 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gz5l\" (UniqueName: \"kubernetes.io/projected/5e00395b-5b37-4ba4-a4e7-7ad08388b053-kube-api-access-2gz5l\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.655699 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.655892 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.655945 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e00395b-5b37-4ba4-a4e7-7ad08388b053-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.656152 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5e00395b-5b37-4ba4-a4e7-7ad08388b053-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.656375 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.658917 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.713894 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-controller-lk79w-config-p69z6"] Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.727145 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lk79w-config-p69z6"] Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761251 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5e00395b-5b37-4ba4-a4e7-7ad08388b053-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761328 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e00395b-5b37-4ba4-a4e7-7ad08388b053-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761371 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gz5l\" (UniqueName: \"kubernetes.io/projected/5e00395b-5b37-4ba4-a4e7-7ad08388b053-kube-api-access-2gz5l\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761426 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc 
kubenswrapper[4713]: I0314 05:51:27.761599 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761633 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e00395b-5b37-4ba4-a4e7-7ad08388b053-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5e00395b-5b37-4ba4-a4e7-7ad08388b053-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761720 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761800 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") pod \"prometheus-metric-storage-0\" (UID: 
\"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761896 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e00395b-5b37-4ba4-a4e7-7ad08388b053-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761940 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.761977 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.762013 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.770640 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5e00395b-5b37-4ba4-a4e7-7ad08388b053-prometheus-metric-storage-rulefiles-1\") pod 
\"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.771451 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e00395b-5b37-4ba4-a4e7-7ad08388b053-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.771718 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5e00395b-5b37-4ba4-a4e7-7ad08388b053-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.773021 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e00395b-5b37-4ba4-a4e7-7ad08388b053-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.774534 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.775612 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.775689 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.777556 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.777630 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3ca1519da4cb0118aff35a94efbea77d8a6bbedb0ea01472a96864ab8cceb7b7/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.779079 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e00395b-5b37-4ba4-a4e7-7ad08388b053-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.789775 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.789864 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.797384 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e00395b-5b37-4ba4-a4e7-7ad08388b053-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.819327 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gz5l\" (UniqueName: \"kubernetes.io/projected/5e00395b-5b37-4ba4-a4e7-7ad08388b053-kube-api-access-2gz5l\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.932896 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fe7653e-2622-4864-bb53-7a533f3742d8\") pod \"prometheus-metric-storage-0\" (UID: \"5e00395b-5b37-4ba4-a4e7-7ad08388b053\") " pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:27 crc kubenswrapper[4713]: I0314 05:51:27.976799 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:29 crc kubenswrapper[4713]: I0314 05:51:29.578241 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca833874-bd20-45e4-a7cd-4ca26596492a" path="/var/lib/kubelet/pods/ca833874-bd20-45e4-a7cd-4ca26596492a/volumes" Mar 14 05:51:31 crc kubenswrapper[4713]: I0314 05:51:31.427714 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 05:51:31 crc kubenswrapper[4713]: W0314 05:51:31.430068 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e00395b_5b37_4ba4_a4e7_7ad08388b053.slice/crio-186fb58b9f82f493e1f659a52a38c9115a7ff17b535333559714d86b27d899d7 WatchSource:0}: Error finding container 186fb58b9f82f493e1f659a52a38c9115a7ff17b535333559714d86b27d899d7: Status 404 returned error can't find the container with id 186fb58b9f82f493e1f659a52a38c9115a7ff17b535333559714d86b27d899d7 Mar 14 05:51:31 crc kubenswrapper[4713]: I0314 05:51:31.457181 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e00395b-5b37-4ba4-a4e7-7ad08388b053","Type":"ContainerStarted","Data":"186fb58b9f82f493e1f659a52a38c9115a7ff17b535333559714d86b27d899d7"} Mar 14 05:51:31 crc kubenswrapper[4713]: I0314 05:51:31.458924 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"8330e6dc0a0ba68c791c3db475620003434e3a2c551b6378e8d8bc2a20a3402c"} Mar 14 05:51:31 crc kubenswrapper[4713]: I0314 05:51:31.458949 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"e8626bcdde5851fc045daec7d1ffca4e708c85099671453532943a80601b15fe"} Mar 14 05:51:31 crc kubenswrapper[4713]: I0314 
05:51:31.460353 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"6590feb6-7a54-4dcb-8656-60f55d65a5f2","Type":"ContainerStarted","Data":"4633a409ec3408915579cb9121db32d93190d9d003e26d9e8c2692fcccc85444"} Mar 14 05:51:31 crc kubenswrapper[4713]: I0314 05:51:31.476332 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=11.53479314 podStartE2EDuration="15.476311868s" podCreationTimestamp="2026-03-14 05:51:16 +0000 UTC" firstStartedPulling="2026-03-14 05:51:26.957115389 +0000 UTC m=+1470.045024699" lastFinishedPulling="2026-03-14 05:51:30.898634127 +0000 UTC m=+1473.986543427" observedRunningTime="2026-03-14 05:51:31.473690474 +0000 UTC m=+1474.561599784" watchObservedRunningTime="2026-03-14 05:51:31.476311868 +0000 UTC m=+1474.564221168" Mar 14 05:51:32 crc kubenswrapper[4713]: I0314 05:51:32.471414 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"01d2e69537c0efc01db9b92651405c77332d56034488d517d6b2a630ca9b30ac"} Mar 14 05:51:32 crc kubenswrapper[4713]: I0314 05:51:32.471950 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"f9d158c571c4646f3f7582d4f43e59f608dde7ef66614926f70e47754b6d07a7"} Mar 14 05:51:33 crc kubenswrapper[4713]: I0314 05:51:33.486646 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"b92f59ffaaa310daeed61583b9e0de652067ab23a4b35de2019e21dd8c8615e0"} Mar 14 05:51:33 crc kubenswrapper[4713]: I0314 05:51:33.487466 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"a669a4a63944f90f91224ab97a9a2f29fb220a8e0874ffcc6791f363f6c71b9f"}
Mar 14 05:51:34 crc kubenswrapper[4713]: I0314 05:51:34.499841 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"15a69d296a44c0768cbb98e109c2803e509a9ef343b0259e6fa02578330990aa"}
Mar 14 05:51:34 crc kubenswrapper[4713]: I0314 05:51:34.500525 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"2c8a2a0c21d0a5f0bbf456547bf4208b6aa339ef053d3d6099fb5d84ec9e4d69"}
Mar 14 05:51:35 crc kubenswrapper[4713]: I0314 05:51:35.662149 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 14 05:51:35 crc kubenswrapper[4713]: I0314 05:51:35.727797 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.000544 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.352918 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bqmlg"]
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.354796 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bqmlg"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.396296 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bqmlg"]
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.480170 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-operator-scripts\") pod \"cinder-db-create-bqmlg\" (UID: \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\") " pod="openstack/cinder-db-create-bqmlg"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.480455 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sb96\" (UniqueName: \"kubernetes.io/projected/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-kube-api-access-4sb96\") pod \"cinder-db-create-bqmlg\" (UID: \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\") " pod="openstack/cinder-db-create-bqmlg"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.533507 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e00395b-5b37-4ba4-a4e7-7ad08388b053","Type":"ContainerStarted","Data":"8c86ed247323881d00363741bb0e256c5291e999eaa3b117add46c0aaae147da"}
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.559851 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"45a939aa24afc37e6e8df357ae958a2ae64404780202f761e2c8962dc4c7d5c3"}
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.573618 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c7b9-account-create-update-8bt6b"]
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.574940 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c7b9-account-create-update-8bt6b"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.580566 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.581995 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-operator-scripts\") pod \"cinder-db-create-bqmlg\" (UID: \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\") " pod="openstack/cinder-db-create-bqmlg"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.582068 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sb96\" (UniqueName: \"kubernetes.io/projected/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-kube-api-access-4sb96\") pod \"cinder-db-create-bqmlg\" (UID: \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\") " pod="openstack/cinder-db-create-bqmlg"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.583139 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-operator-scripts\") pod \"cinder-db-create-bqmlg\" (UID: \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\") " pod="openstack/cinder-db-create-bqmlg"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.611018 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c7b9-account-create-update-8bt6b"]
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.665261 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sb96\" (UniqueName: \"kubernetes.io/projected/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-kube-api-access-4sb96\") pod \"cinder-db-create-bqmlg\" (UID: \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\") " pod="openstack/cinder-db-create-bqmlg"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.684327 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzmmh\" (UniqueName: \"kubernetes.io/projected/897c3cbc-a12b-486f-887e-59b4d6e37f42-kube-api-access-qzmmh\") pod \"cinder-c7b9-account-create-update-8bt6b\" (UID: \"897c3cbc-a12b-486f-887e-59b4d6e37f42\") " pod="openstack/cinder-c7b9-account-create-update-8bt6b"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.684403 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/897c3cbc-a12b-486f-887e-59b4d6e37f42-operator-scripts\") pod \"cinder-c7b9-account-create-update-8bt6b\" (UID: \"897c3cbc-a12b-486f-887e-59b4d6e37f42\") " pod="openstack/cinder-c7b9-account-create-update-8bt6b"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.696315 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-pfctt"]
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.702724 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pfctt"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.714228 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pfctt"]
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.782619 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bqmlg"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.789452 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzmmh\" (UniqueName: \"kubernetes.io/projected/897c3cbc-a12b-486f-887e-59b4d6e37f42-kube-api-access-qzmmh\") pod \"cinder-c7b9-account-create-update-8bt6b\" (UID: \"897c3cbc-a12b-486f-887e-59b4d6e37f42\") " pod="openstack/cinder-c7b9-account-create-update-8bt6b"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.789517 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxwr9\" (UniqueName: \"kubernetes.io/projected/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-kube-api-access-fxwr9\") pod \"heat-db-create-pfctt\" (UID: \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\") " pod="openstack/heat-db-create-pfctt"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.789545 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/897c3cbc-a12b-486f-887e-59b4d6e37f42-operator-scripts\") pod \"cinder-c7b9-account-create-update-8bt6b\" (UID: \"897c3cbc-a12b-486f-887e-59b4d6e37f42\") " pod="openstack/cinder-c7b9-account-create-update-8bt6b"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.789570 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-operator-scripts\") pod \"heat-db-create-pfctt\" (UID: \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\") " pod="openstack/heat-db-create-pfctt"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.790698 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/897c3cbc-a12b-486f-887e-59b4d6e37f42-operator-scripts\") pod \"cinder-c7b9-account-create-update-8bt6b\" (UID: \"897c3cbc-a12b-486f-887e-59b4d6e37f42\") " pod="openstack/cinder-c7b9-account-create-update-8bt6b"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.820313 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzmmh\" (UniqueName: \"kubernetes.io/projected/897c3cbc-a12b-486f-887e-59b4d6e37f42-kube-api-access-qzmmh\") pod \"cinder-c7b9-account-create-update-8bt6b\" (UID: \"897c3cbc-a12b-486f-887e-59b4d6e37f42\") " pod="openstack/cinder-c7b9-account-create-update-8bt6b"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.891175 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-aeae-account-create-update-5rh7j"]
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.891906 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxwr9\" (UniqueName: \"kubernetes.io/projected/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-kube-api-access-fxwr9\") pod \"heat-db-create-pfctt\" (UID: \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\") " pod="openstack/heat-db-create-pfctt"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.891971 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-operator-scripts\") pod \"heat-db-create-pfctt\" (UID: \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\") " pod="openstack/heat-db-create-pfctt"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.892524 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-aeae-account-create-update-5rh7j"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.892764 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-operator-scripts\") pod \"heat-db-create-pfctt\" (UID: \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\") " pod="openstack/heat-db-create-pfctt"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.897627 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.901829 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c7b9-account-create-update-8bt6b"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.935341 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-aeae-account-create-update-5rh7j"]
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.947344 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxwr9\" (UniqueName: \"kubernetes.io/projected/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-kube-api-access-fxwr9\") pod \"heat-db-create-pfctt\" (UID: \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\") " pod="openstack/heat-db-create-pfctt"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.994060 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qrf9\" (UniqueName: \"kubernetes.io/projected/02460855-3f34-4eeb-b287-3b2a5fb94d89-kube-api-access-8qrf9\") pod \"heat-aeae-account-create-update-5rh7j\" (UID: \"02460855-3f34-4eeb-b287-3b2a5fb94d89\") " pod="openstack/heat-aeae-account-create-update-5rh7j"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.994152 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02460855-3f34-4eeb-b287-3b2a5fb94d89-operator-scripts\") pod \"heat-aeae-account-create-update-5rh7j\" (UID: \"02460855-3f34-4eeb-b287-3b2a5fb94d89\") " pod="openstack/heat-aeae-account-create-update-5rh7j"
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.994343 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-5p56m"]
Mar 14 05:51:36 crc kubenswrapper[4713]: I0314 05:51:36.995570 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5p56m"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.022286 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a7fc-account-create-update-7pwn5"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.023831 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a7fc-account-create-update-7pwn5"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.028511 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.039276 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pfctt"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.040985 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-g2xdv"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.042959 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.051362 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.055126 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.055130 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z87lf"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.055674 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.071307 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5p56m"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.101051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc5dh\" (UniqueName: \"kubernetes.io/projected/af90c93b-5f4e-41c0-8a65-1a480062a11f-kube-api-access-qc5dh\") pod \"barbican-a7fc-account-create-update-7pwn5\" (UID: \"af90c93b-5f4e-41c0-8a65-1a480062a11f\") " pod="openstack/barbican-a7fc-account-create-update-7pwn5"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.101130 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk9nd\" (UniqueName: \"kubernetes.io/projected/279d8812-ca1b-4f1e-a094-072076726e8c-kube-api-access-qk9nd\") pod \"barbican-db-create-5p56m\" (UID: \"279d8812-ca1b-4f1e-a094-072076726e8c\") " pod="openstack/barbican-db-create-5p56m"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.101178 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02460855-3f34-4eeb-b287-3b2a5fb94d89-operator-scripts\") pod \"heat-aeae-account-create-update-5rh7j\" (UID: \"02460855-3f34-4eeb-b287-3b2a5fb94d89\") " pod="openstack/heat-aeae-account-create-update-5rh7j"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.114516 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/279d8812-ca1b-4f1e-a094-072076726e8c-operator-scripts\") pod \"barbican-db-create-5p56m\" (UID: \"279d8812-ca1b-4f1e-a094-072076726e8c\") " pod="openstack/barbican-db-create-5p56m"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.114592 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af90c93b-5f4e-41c0-8a65-1a480062a11f-operator-scripts\") pod \"barbican-a7fc-account-create-update-7pwn5\" (UID: \"af90c93b-5f4e-41c0-8a65-1a480062a11f\") " pod="openstack/barbican-a7fc-account-create-update-7pwn5"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.114802 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qrf9\" (UniqueName: \"kubernetes.io/projected/02460855-3f34-4eeb-b287-3b2a5fb94d89-kube-api-access-8qrf9\") pod \"heat-aeae-account-create-update-5rh7j\" (UID: \"02460855-3f34-4eeb-b287-3b2a5fb94d89\") " pod="openstack/heat-aeae-account-create-update-5rh7j"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.115655 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02460855-3f34-4eeb-b287-3b2a5fb94d89-operator-scripts\") pod \"heat-aeae-account-create-update-5rh7j\" (UID: \"02460855-3f34-4eeb-b287-3b2a5fb94d89\") " pod="openstack/heat-aeae-account-create-update-5rh7j"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.133331 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-g2xdv"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.173850 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qrf9\" (UniqueName: \"kubernetes.io/projected/02460855-3f34-4eeb-b287-3b2a5fb94d89-kube-api-access-8qrf9\") pod \"heat-aeae-account-create-update-5rh7j\" (UID: \"02460855-3f34-4eeb-b287-3b2a5fb94d89\") " pod="openstack/heat-aeae-account-create-update-5rh7j"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.200279 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a7fc-account-create-update-7pwn5"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.219161 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-aeae-account-create-update-5rh7j"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.219558 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc5dh\" (UniqueName: \"kubernetes.io/projected/af90c93b-5f4e-41c0-8a65-1a480062a11f-kube-api-access-qc5dh\") pod \"barbican-a7fc-account-create-update-7pwn5\" (UID: \"af90c93b-5f4e-41c0-8a65-1a480062a11f\") " pod="openstack/barbican-a7fc-account-create-update-7pwn5"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.219605 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-kube-api-access-mjcfv\") pod \"keystone-db-sync-g2xdv\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.219644 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk9nd\" (UniqueName: \"kubernetes.io/projected/279d8812-ca1b-4f1e-a094-072076726e8c-kube-api-access-qk9nd\") pod \"barbican-db-create-5p56m\" (UID: \"279d8812-ca1b-4f1e-a094-072076726e8c\") " pod="openstack/barbican-db-create-5p56m"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.219721 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/279d8812-ca1b-4f1e-a094-072076726e8c-operator-scripts\") pod \"barbican-db-create-5p56m\" (UID: \"279d8812-ca1b-4f1e-a094-072076726e8c\") " pod="openstack/barbican-db-create-5p56m"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.219747 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af90c93b-5f4e-41c0-8a65-1a480062a11f-operator-scripts\") pod \"barbican-a7fc-account-create-update-7pwn5\" (UID: \"af90c93b-5f4e-41c0-8a65-1a480062a11f\") " pod="openstack/barbican-a7fc-account-create-update-7pwn5"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.219767 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-combined-ca-bundle\") pod \"keystone-db-sync-g2xdv\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.219783 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-config-data\") pod \"keystone-db-sync-g2xdv\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.220843 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/279d8812-ca1b-4f1e-a094-072076726e8c-operator-scripts\") pod \"barbican-db-create-5p56m\" (UID: \"279d8812-ca1b-4f1e-a094-072076726e8c\") " pod="openstack/barbican-db-create-5p56m"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.221125 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af90c93b-5f4e-41c0-8a65-1a480062a11f-operator-scripts\") pod \"barbican-a7fc-account-create-update-7pwn5\" (UID: \"af90c93b-5f4e-41c0-8a65-1a480062a11f\") " pod="openstack/barbican-a7fc-account-create-update-7pwn5"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.258087 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc5dh\" (UniqueName: \"kubernetes.io/projected/af90c93b-5f4e-41c0-8a65-1a480062a11f-kube-api-access-qc5dh\") pod \"barbican-a7fc-account-create-update-7pwn5\" (UID: \"af90c93b-5f4e-41c0-8a65-1a480062a11f\") " pod="openstack/barbican-a7fc-account-create-update-7pwn5"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.268029 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk9nd\" (UniqueName: \"kubernetes.io/projected/279d8812-ca1b-4f1e-a094-072076726e8c-kube-api-access-qk9nd\") pod \"barbican-db-create-5p56m\" (UID: \"279d8812-ca1b-4f1e-a094-072076726e8c\") " pod="openstack/barbican-db-create-5p56m"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.302086 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8cm8b"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.303794 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8cm8b"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.322479 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-combined-ca-bundle\") pod \"keystone-db-sync-g2xdv\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.322535 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-config-data\") pod \"keystone-db-sync-g2xdv\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.322704 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-kube-api-access-mjcfv\") pod \"keystone-db-sync-g2xdv\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.338049 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-config-data\") pod \"keystone-db-sync-g2xdv\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.350068 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-combined-ca-bundle\") pod \"keystone-db-sync-g2xdv\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.358999 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-kube-api-access-mjcfv\") pod \"keystone-db-sync-g2xdv\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.377072 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a7fc-account-create-update-7pwn5"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.399955 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8cm8b"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.400438 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-g2xdv"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.410276 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-da85-account-create-update-dnlj4"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.412019 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da85-account-create-update-dnlj4"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.417821 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.424293 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9djs5\" (UniqueName: \"kubernetes.io/projected/bda8f62b-2f71-482e-8189-ce9ec768da83-kube-api-access-9djs5\") pod \"neutron-db-create-8cm8b\" (UID: \"bda8f62b-2f71-482e-8189-ce9ec768da83\") " pod="openstack/neutron-db-create-8cm8b"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.424389 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8f62b-2f71-482e-8189-ce9ec768da83-operator-scripts\") pod \"neutron-db-create-8cm8b\" (UID: \"bda8f62b-2f71-482e-8189-ce9ec768da83\") " pod="openstack/neutron-db-create-8cm8b"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.475090 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-da85-account-create-update-dnlj4"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.531575 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8f62b-2f71-482e-8189-ce9ec768da83-operator-scripts\") pod \"neutron-db-create-8cm8b\" (UID: \"bda8f62b-2f71-482e-8189-ce9ec768da83\") " pod="openstack/neutron-db-create-8cm8b"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.531733 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166c7fc9-cfdb-448b-8a3e-2915689f014e-operator-scripts\") pod \"neutron-da85-account-create-update-dnlj4\" (UID: \"166c7fc9-cfdb-448b-8a3e-2915689f014e\") " pod="openstack/neutron-da85-account-create-update-dnlj4"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.531902 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9djs5\" (UniqueName: \"kubernetes.io/projected/bda8f62b-2f71-482e-8189-ce9ec768da83-kube-api-access-9djs5\") pod \"neutron-db-create-8cm8b\" (UID: \"bda8f62b-2f71-482e-8189-ce9ec768da83\") " pod="openstack/neutron-db-create-8cm8b"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.531950 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zrfw\" (UniqueName: \"kubernetes.io/projected/166c7fc9-cfdb-448b-8a3e-2915689f014e-kube-api-access-2zrfw\") pod \"neutron-da85-account-create-update-dnlj4\" (UID: \"166c7fc9-cfdb-448b-8a3e-2915689f014e\") " pod="openstack/neutron-da85-account-create-update-dnlj4"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.533021 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8f62b-2f71-482e-8189-ce9ec768da83-operator-scripts\") pod \"neutron-db-create-8cm8b\" (UID: \"bda8f62b-2f71-482e-8189-ce9ec768da83\") " pod="openstack/neutron-db-create-8cm8b"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.547463 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5p56m"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.565803 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9djs5\" (UniqueName: \"kubernetes.io/projected/bda8f62b-2f71-482e-8189-ce9ec768da83-kube-api-access-9djs5\") pod \"neutron-db-create-8cm8b\" (UID: \"bda8f62b-2f71-482e-8189-ce9ec768da83\") " pod="openstack/neutron-db-create-8cm8b"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.636573 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zrfw\" (UniqueName: \"kubernetes.io/projected/166c7fc9-cfdb-448b-8a3e-2915689f014e-kube-api-access-2zrfw\") pod \"neutron-da85-account-create-update-dnlj4\" (UID: \"166c7fc9-cfdb-448b-8a3e-2915689f014e\") " pod="openstack/neutron-da85-account-create-update-dnlj4"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.636728 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166c7fc9-cfdb-448b-8a3e-2915689f014e-operator-scripts\") pod \"neutron-da85-account-create-update-dnlj4\" (UID: \"166c7fc9-cfdb-448b-8a3e-2915689f014e\") " pod="openstack/neutron-da85-account-create-update-dnlj4"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.637839 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166c7fc9-cfdb-448b-8a3e-2915689f014e-operator-scripts\") pod \"neutron-da85-account-create-update-dnlj4\" (UID: \"166c7fc9-cfdb-448b-8a3e-2915689f014e\") " pod="openstack/neutron-da85-account-create-update-dnlj4"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.667930 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"116f671a740796f010c97ac0d760c134e4669c4f7ba05ef763beaa45b6cf6da5"}
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.667991 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"d03b86a875e0e1c77ee7a9ea7f0ee7f4cae01b34d4840441532bd5dd22736860"}
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.675312 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zrfw\" (UniqueName: \"kubernetes.io/projected/166c7fc9-cfdb-448b-8a3e-2915689f014e-kube-api-access-2zrfw\") pod \"neutron-da85-account-create-update-dnlj4\" (UID: \"166c7fc9-cfdb-448b-8a3e-2915689f014e\") " pod="openstack/neutron-da85-account-create-update-dnlj4"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.737338 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da85-account-create-update-dnlj4"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.755809 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8cm8b"
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.896187 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c7b9-account-create-update-8bt6b"]
Mar 14 05:51:37 crc kubenswrapper[4713]: I0314 05:51:37.912982 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bqmlg"]
Mar 14 05:51:38 crc kubenswrapper[4713]: W0314 05:51:38.449570 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod897c3cbc_a12b_486f_887e_59b4d6e37f42.slice/crio-c9f79a2283a060ab7bf3133a78909ca175caa09276483aa5106a10c6df137cdf WatchSource:0}: Error finding container c9f79a2283a060ab7bf3133a78909ca175caa09276483aa5106a10c6df137cdf: Status 404 returned error can't find the container with id c9f79a2283a060ab7bf3133a78909ca175caa09276483aa5106a10c6df137cdf
Mar 14 05:51:38 crc kubenswrapper[4713]: I0314 05:51:38.550257 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pfctt"]
Mar 14 05:51:38 crc kubenswrapper[4713]: I0314 05:51:38.720549 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"bd5b3670aa6ef62141bd866a2136de9d28163ddb6d8918e017e2469678b3b72f"}
Mar 14 05:51:38 crc kubenswrapper[4713]: I0314 05:51:38.735285 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c7b9-account-create-update-8bt6b" event={"ID":"897c3cbc-a12b-486f-887e-59b4d6e37f42","Type":"ContainerStarted","Data":"c9f79a2283a060ab7bf3133a78909ca175caa09276483aa5106a10c6df137cdf"}
Mar 14 05:51:38 crc kubenswrapper[4713]: I0314 05:51:38.740736 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-aeae-account-create-update-5rh7j"]
Mar 14 05:51:38 crc kubenswrapper[4713]: I0314 05:51:38.742415 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bqmlg" event={"ID":"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427","Type":"ContainerStarted","Data":"393c272774747328f65ce3cd795fe3e5daa2379d08113e68d812b107559969d5"}
Mar 14 05:51:38 crc kubenswrapper[4713]: I0314 05:51:38.756056 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pfctt" event={"ID":"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd","Type":"ContainerStarted","Data":"bfc9ae16180a66534dfcfb5f798364f698556ca8d55aaeee5356d34e4d0a7523"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.027370 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-g2xdv"]
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.053280 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5p56m"]
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.066764 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a7fc-account-create-update-7pwn5"]
Mar 14 05:51:39 crc kubenswrapper[4713]: W0314 05:51:39.078868 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf90c93b_5f4e_41c0_8a65_1a480062a11f.slice/crio-a51e97a035f65eb9a762078d4f08fa8c4050bc82e882cb69aea5711d44b26a63 WatchSource:0}: Error finding container a51e97a035f65eb9a762078d4f08fa8c4050bc82e882cb69aea5711d44b26a63: Status 404 returned error can't find the container with id a51e97a035f65eb9a762078d4f08fa8c4050bc82e882cb69aea5711d44b26a63
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.261220 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-da85-account-create-update-dnlj4"]
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.290775 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8cm8b"]
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.771099 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a7fc-account-create-update-7pwn5" event={"ID":"af90c93b-5f4e-41c0-8a65-1a480062a11f","Type":"ContainerStarted","Data":"a51e97a035f65eb9a762078d4f08fa8c4050bc82e882cb69aea5711d44b26a63"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.781852 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"a00945b6340e8af187f3cfe1300a61f0eaf10790f33c39ba1e9eb5640281b8e3"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.785589 4713 generic.go:334] "Generic (PLEG): container finished" podID="897c3cbc-a12b-486f-887e-59b4d6e37f42" containerID="fb2f44f477d8431377ea7d4e65887deb9c1c517a62285744a683a9967a40bb03" exitCode=0
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.785740 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c7b9-account-create-update-8bt6b" event={"ID":"897c3cbc-a12b-486f-887e-59b4d6e37f42","Type":"ContainerDied","Data":"fb2f44f477d8431377ea7d4e65887deb9c1c517a62285744a683a9967a40bb03"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.788120 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g2xdv" event={"ID":"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9","Type":"ContainerStarted","Data":"0c0723632e7634fd096451485c52a53e1b40397f216d44bbf162ccd8fa6e1a4a"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.790166 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da85-account-create-update-dnlj4" event={"ID":"166c7fc9-cfdb-448b-8a3e-2915689f014e","Type":"ContainerStarted","Data":"a89c7e09c8b43b108f8d8bc09e23d191cbf9911864cee53bd479c759b2171044"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.793190 4713 generic.go:334] "Generic (PLEG): container finished" podID="6f9b1fa4-a9bd-434e-aee8-3ca38abfd427" containerID="89db566da35dcb73d2dc6c608c23426d2ac6e7da9f7d335441285fb319f440f4" exitCode=0
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.793261 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bqmlg" event={"ID":"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427","Type":"ContainerDied","Data":"89db566da35dcb73d2dc6c608c23426d2ac6e7da9f7d335441285fb319f440f4"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.795866 4713 generic.go:334] "Generic (PLEG): container finished" podID="eb4f6d50-5931-4dec-82ed-606d0a53fb6e" containerID="51d42e8f7874016a02750cc5ed46e5ace9e5c9d2e7df4f0e0842bc73f90e1203" exitCode=0
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.795966 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k57vf" event={"ID":"eb4f6d50-5931-4dec-82ed-606d0a53fb6e","Type":"ContainerDied","Data":"51d42e8f7874016a02750cc5ed46e5ace9e5c9d2e7df4f0e0842bc73f90e1203"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.797642 4713 generic.go:334] "Generic (PLEG): container finished" podID="b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd" containerID="3effbde9bf7f8bee2421c71f66aec38766c89897bd8eb23f9fe9af65fd35b541" exitCode=0
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.797685 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pfctt" event={"ID":"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd","Type":"ContainerDied","Data":"3effbde9bf7f8bee2421c71f66aec38766c89897bd8eb23f9fe9af65fd35b541"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.799407 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5p56m" event={"ID":"279d8812-ca1b-4f1e-a094-072076726e8c","Type":"ContainerStarted","Data":"59fb2c30fe25ab84c1554fc2b757ba4fdc8d8b5d6be9bab47cee3007a5624d3e"}
Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.801735 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-aeae-account-create-update-5rh7j"
event={"ID":"02460855-3f34-4eeb-b287-3b2a5fb94d89","Type":"ContainerDied","Data":"307cdc6c35584efdf33e740cd56d094a673973aebf34901985ccf490976e227f"} Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.801353 4713 generic.go:334] "Generic (PLEG): container finished" podID="02460855-3f34-4eeb-b287-3b2a5fb94d89" containerID="307cdc6c35584efdf33e740cd56d094a673973aebf34901985ccf490976e227f" exitCode=0 Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.804374 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-aeae-account-create-update-5rh7j" event={"ID":"02460855-3f34-4eeb-b287-3b2a5fb94d89","Type":"ContainerStarted","Data":"22dec711cdcd95909a493f2e97e9ee81b5dbf4fbb4009d2ac3d95b6127f51cca"} Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.806335 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8cm8b" event={"ID":"bda8f62b-2f71-482e-8189-ce9ec768da83","Type":"ContainerStarted","Data":"f754cd3b6e2b126dc29bcc10f16834ae7152743296ea991b59a3d120205bdfed"} Mar 14 05:51:39 crc kubenswrapper[4713]: I0314 05:51:39.826401 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-5p56m" podStartSLOduration=3.826380559 podStartE2EDuration="3.826380559s" podCreationTimestamp="2026-03-14 05:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:51:39.819075957 +0000 UTC m=+1482.906985257" watchObservedRunningTime="2026-03-14 05:51:39.826380559 +0000 UTC m=+1482.914289859" Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.731864 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:51:40 crc kubenswrapper[4713]: 
I0314 05:51:40.732200 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.827634 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"39274ccafc039b9b3c47fabe209da4527aee000ad91b9d233d585b877c83913e"} Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.827675 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80f03c3b-d224-4e9d-8e52-e0376b3f215f","Type":"ContainerStarted","Data":"0c5d15c07795faf0226cd576234fb6d5e38f6008b0e1b6cd3066635995579852"} Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.836342 4713 generic.go:334] "Generic (PLEG): container finished" podID="279d8812-ca1b-4f1e-a094-072076726e8c" containerID="f5b7814e09ece1d77110d47ef200c49c637eac9a9ab47d2f394126768a7a087f" exitCode=0 Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.836648 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5p56m" event={"ID":"279d8812-ca1b-4f1e-a094-072076726e8c","Type":"ContainerDied","Data":"f5b7814e09ece1d77110d47ef200c49c637eac9a9ab47d2f394126768a7a087f"} Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.841071 4713 generic.go:334] "Generic (PLEG): container finished" podID="bda8f62b-2f71-482e-8189-ce9ec768da83" containerID="1a44d6756bdab94bc8e566d791d5907f49904cdbd29506a3b8e8519aa3086522" exitCode=0 Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.841261 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8cm8b" 
event={"ID":"bda8f62b-2f71-482e-8189-ce9ec768da83","Type":"ContainerDied","Data":"1a44d6756bdab94bc8e566d791d5907f49904cdbd29506a3b8e8519aa3086522"} Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.845638 4713 generic.go:334] "Generic (PLEG): container finished" podID="166c7fc9-cfdb-448b-8a3e-2915689f014e" containerID="bb77a49d7913dd4497d606f3761b268321a685306a8c9b887a32cff4cab5b47b" exitCode=0 Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.845798 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da85-account-create-update-dnlj4" event={"ID":"166c7fc9-cfdb-448b-8a3e-2915689f014e","Type":"ContainerDied","Data":"bb77a49d7913dd4497d606f3761b268321a685306a8c9b887a32cff4cab5b47b"} Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.850426 4713 generic.go:334] "Generic (PLEG): container finished" podID="af90c93b-5f4e-41c0-8a65-1a480062a11f" containerID="b73163b5d670b2cff92d2108ac79295448b493456a92cdb9d3fdbf4f96518c09" exitCode=0 Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.850737 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a7fc-account-create-update-7pwn5" event={"ID":"af90c93b-5f4e-41c0-8a65-1a480062a11f","Type":"ContainerDied","Data":"b73163b5d670b2cff92d2108ac79295448b493456a92cdb9d3fdbf4f96518c09"} Mar 14 05:51:40 crc kubenswrapper[4713]: I0314 05:51:40.873101 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=42.013381726 podStartE2EDuration="50.8730597s" podCreationTimestamp="2026-03-14 05:50:50 +0000 UTC" firstStartedPulling="2026-03-14 05:51:27.166389696 +0000 UTC m=+1470.254298996" lastFinishedPulling="2026-03-14 05:51:36.02606767 +0000 UTC m=+1479.113976970" observedRunningTime="2026-03-14 05:51:40.865915042 +0000 UTC m=+1483.953824362" watchObservedRunningTime="2026-03-14 05:51:40.8730597 +0000 UTC m=+1483.960969000" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.176879 4713 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-hk97k"] Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.184421 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.189298 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.199516 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-hk97k"] Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.249027 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-config\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.249102 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.249175 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.249313 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.249388 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92n58\" (UniqueName: \"kubernetes.io/projected/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-kube-api-access-92n58\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.249432 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.351232 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-config\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.351289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.351331 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.351358 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.351419 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92n58\" (UniqueName: \"kubernetes.io/projected/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-kube-api-access-92n58\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.351443 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.352529 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-config\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.352587 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-svc\") pod 
\"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.352871 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.353158 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.353418 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.373905 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92n58\" (UniqueName: \"kubernetes.io/projected/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-kube-api-access-92n58\") pod \"dnsmasq-dns-5c79d794d7-hk97k\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.516682 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bqmlg" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.530109 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.646553 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c7b9-account-create-update-8bt6b" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.658566 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k57vf" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.664312 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-operator-scripts\") pod \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\" (UID: \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.665469 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sb96\" (UniqueName: \"kubernetes.io/projected/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-kube-api-access-4sb96\") pod \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\" (UID: \"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.677457 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f9b1fa4-a9bd-434e-aee8-3ca38abfd427" (UID: "6f9b1fa4-a9bd-434e-aee8-3ca38abfd427"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.681090 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-kube-api-access-4sb96" (OuterVolumeSpecName: "kube-api-access-4sb96") pod "6f9b1fa4-a9bd-434e-aee8-3ca38abfd427" (UID: "6f9b1fa4-a9bd-434e-aee8-3ca38abfd427"). 
InnerVolumeSpecName "kube-api-access-4sb96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.683648 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-aeae-account-create-update-5rh7j" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.709956 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pfctt" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.767688 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/897c3cbc-a12b-486f-887e-59b4d6e37f42-operator-scripts\") pod \"897c3cbc-a12b-486f-887e-59b4d6e37f42\" (UID: \"897c3cbc-a12b-486f-887e-59b4d6e37f42\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.767729 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzmmh\" (UniqueName: \"kubernetes.io/projected/897c3cbc-a12b-486f-887e-59b4d6e37f42-kube-api-access-qzmmh\") pod \"897c3cbc-a12b-486f-887e-59b4d6e37f42\" (UID: \"897c3cbc-a12b-486f-887e-59b4d6e37f42\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.767860 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-operator-scripts\") pod \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\" (UID: \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.768335 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd" (UID: "b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.768783 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/897c3cbc-a12b-486f-887e-59b4d6e37f42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "897c3cbc-a12b-486f-887e-59b4d6e37f42" (UID: "897c3cbc-a12b-486f-887e-59b4d6e37f42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.770999 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxwr9\" (UniqueName: \"kubernetes.io/projected/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-kube-api-access-fxwr9\") pod \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\" (UID: \"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.771054 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-db-sync-config-data\") pod \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.771113 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02460855-3f34-4eeb-b287-3b2a5fb94d89-operator-scripts\") pod \"02460855-3f34-4eeb-b287-3b2a5fb94d89\" (UID: \"02460855-3f34-4eeb-b287-3b2a5fb94d89\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.771115 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/897c3cbc-a12b-486f-887e-59b4d6e37f42-kube-api-access-qzmmh" (OuterVolumeSpecName: "kube-api-access-qzmmh") pod "897c3cbc-a12b-486f-887e-59b4d6e37f42" (UID: "897c3cbc-a12b-486f-887e-59b4d6e37f42"). InnerVolumeSpecName "kube-api-access-qzmmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.771139 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-config-data\") pod \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.771157 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-combined-ca-bundle\") pod \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.771176 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mcsx\" (UniqueName: \"kubernetes.io/projected/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-kube-api-access-8mcsx\") pod \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\" (UID: \"eb4f6d50-5931-4dec-82ed-606d0a53fb6e\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.771255 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qrf9\" (UniqueName: \"kubernetes.io/projected/02460855-3f34-4eeb-b287-3b2a5fb94d89-kube-api-access-8qrf9\") pod \"02460855-3f34-4eeb-b287-3b2a5fb94d89\" (UID: \"02460855-3f34-4eeb-b287-3b2a5fb94d89\") " Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.771512 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02460855-3f34-4eeb-b287-3b2a5fb94d89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02460855-3f34-4eeb-b287-3b2a5fb94d89" (UID: "02460855-3f34-4eeb-b287-3b2a5fb94d89"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.772406 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.772425 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sb96\" (UniqueName: \"kubernetes.io/projected/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-kube-api-access-4sb96\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.772436 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02460855-3f34-4eeb-b287-3b2a5fb94d89-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.772447 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/897c3cbc-a12b-486f-887e-59b4d6e37f42-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.772456 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzmmh\" (UniqueName: \"kubernetes.io/projected/897c3cbc-a12b-486f-887e-59b4d6e37f42-kube-api-access-qzmmh\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.772466 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.775122 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-kube-api-access-fxwr9" (OuterVolumeSpecName: "kube-api-access-fxwr9") pod 
"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd" (UID: "b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd"). InnerVolumeSpecName "kube-api-access-fxwr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.775168 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02460855-3f34-4eeb-b287-3b2a5fb94d89-kube-api-access-8qrf9" (OuterVolumeSpecName: "kube-api-access-8qrf9") pod "02460855-3f34-4eeb-b287-3b2a5fb94d89" (UID: "02460855-3f34-4eeb-b287-3b2a5fb94d89"). InnerVolumeSpecName "kube-api-access-8qrf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.775630 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "eb4f6d50-5931-4dec-82ed-606d0a53fb6e" (UID: "eb4f6d50-5931-4dec-82ed-606d0a53fb6e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.781293 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-kube-api-access-8mcsx" (OuterVolumeSpecName: "kube-api-access-8mcsx") pod "eb4f6d50-5931-4dec-82ed-606d0a53fb6e" (UID: "eb4f6d50-5931-4dec-82ed-606d0a53fb6e"). InnerVolumeSpecName "kube-api-access-8mcsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.802566 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb4f6d50-5931-4dec-82ed-606d0a53fb6e" (UID: "eb4f6d50-5931-4dec-82ed-606d0a53fb6e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.874893 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxwr9\" (UniqueName: \"kubernetes.io/projected/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd-kube-api-access-fxwr9\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.874935 4713 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.874947 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.874958 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mcsx\" (UniqueName: \"kubernetes.io/projected/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-kube-api-access-8mcsx\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.874971 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qrf9\" (UniqueName: \"kubernetes.io/projected/02460855-3f34-4eeb-b287-3b2a5fb94d89-kube-api-access-8qrf9\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.877362 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-config-data" (OuterVolumeSpecName: "config-data") pod "eb4f6d50-5931-4dec-82ed-606d0a53fb6e" (UID: "eb4f6d50-5931-4dec-82ed-606d0a53fb6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.886694 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-pfctt" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.886690 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pfctt" event={"ID":"b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd","Type":"ContainerDied","Data":"bfc9ae16180a66534dfcfb5f798364f698556ca8d55aaeee5356d34e4d0a7523"} Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.886856 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfc9ae16180a66534dfcfb5f798364f698556ca8d55aaeee5356d34e4d0a7523" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.899133 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c7b9-account-create-update-8bt6b" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.899135 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c7b9-account-create-update-8bt6b" event={"ID":"897c3cbc-a12b-486f-887e-59b4d6e37f42","Type":"ContainerDied","Data":"c9f79a2283a060ab7bf3133a78909ca175caa09276483aa5106a10c6df137cdf"} Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.899246 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9f79a2283a060ab7bf3133a78909ca175caa09276483aa5106a10c6df137cdf" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.900851 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-aeae-account-create-update-5rh7j" event={"ID":"02460855-3f34-4eeb-b287-3b2a5fb94d89","Type":"ContainerDied","Data":"22dec711cdcd95909a493f2e97e9ee81b5dbf4fbb4009d2ac3d95b6127f51cca"} Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.900868 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22dec711cdcd95909a493f2e97e9ee81b5dbf4fbb4009d2ac3d95b6127f51cca" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.900920 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-aeae-account-create-update-5rh7j" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.922037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bqmlg" event={"ID":"6f9b1fa4-a9bd-434e-aee8-3ca38abfd427","Type":"ContainerDied","Data":"393c272774747328f65ce3cd795fe3e5daa2379d08113e68d812b107559969d5"} Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.922074 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="393c272774747328f65ce3cd795fe3e5daa2379d08113e68d812b107559969d5" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.922167 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bqmlg" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.939279 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k57vf" event={"ID":"eb4f6d50-5931-4dec-82ed-606d0a53fb6e","Type":"ContainerDied","Data":"cba5d3df262fe8641516e9b6408a97a1aea24b9b56bcc9ee3d379effeff00af2"} Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.939312 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cba5d3df262fe8641516e9b6408a97a1aea24b9b56bcc9ee3d379effeff00af2" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.940392 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-k57vf" Mar 14 05:51:41 crc kubenswrapper[4713]: I0314 05:51:41.977189 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f6d50-5931-4dec-82ed-606d0a53fb6e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.186317 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-hk97k"] Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.200776 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-hk97k"] Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.244353 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pzmjg"] Mar 14 05:51:42 crc kubenswrapper[4713]: E0314 05:51:42.244997 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02460855-3f34-4eeb-b287-3b2a5fb94d89" containerName="mariadb-account-create-update" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245022 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="02460855-3f34-4eeb-b287-3b2a5fb94d89" containerName="mariadb-account-create-update" Mar 14 05:51:42 crc kubenswrapper[4713]: E0314 05:51:42.245043 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4f6d50-5931-4dec-82ed-606d0a53fb6e" containerName="glance-db-sync" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245049 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f6d50-5931-4dec-82ed-606d0a53fb6e" containerName="glance-db-sync" Mar 14 05:51:42 crc kubenswrapper[4713]: E0314 05:51:42.245061 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd" containerName="mariadb-database-create" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245068 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd" containerName="mariadb-database-create" Mar 14 05:51:42 crc kubenswrapper[4713]: E0314 05:51:42.245076 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897c3cbc-a12b-486f-887e-59b4d6e37f42" containerName="mariadb-account-create-update" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245086 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="897c3cbc-a12b-486f-887e-59b4d6e37f42" containerName="mariadb-account-create-update" Mar 14 05:51:42 crc kubenswrapper[4713]: E0314 05:51:42.245101 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9b1fa4-a9bd-434e-aee8-3ca38abfd427" containerName="mariadb-database-create" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245107 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9b1fa4-a9bd-434e-aee8-3ca38abfd427" containerName="mariadb-database-create" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245398 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="897c3cbc-a12b-486f-887e-59b4d6e37f42" containerName="mariadb-account-create-update" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245420 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9b1fa4-a9bd-434e-aee8-3ca38abfd427" containerName="mariadb-database-create" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245435 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f6d50-5931-4dec-82ed-606d0a53fb6e" containerName="glance-db-sync" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245454 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="02460855-3f34-4eeb-b287-3b2a5fb94d89" containerName="mariadb-account-create-update" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.245468 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd" containerName="mariadb-database-create" Mar 14 05:51:42 
crc kubenswrapper[4713]: I0314 05:51:42.247714 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.249548 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pzmjg"] Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.289920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.290012 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqkr4\" (UniqueName: \"kubernetes.io/projected/c25cc920-5615-46b7-bca6-c0614071eddd-kube-api-access-cqkr4\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.290048 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-config\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.290120 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 
05:51:42.290148 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.290226 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.392079 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.392162 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.392254 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqkr4\" (UniqueName: \"kubernetes.io/projected/c25cc920-5615-46b7-bca6-c0614071eddd-kube-api-access-cqkr4\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc 
kubenswrapper[4713]: I0314 05:51:42.392282 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-config\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.392347 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.392370 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.393618 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.393841 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.393916 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.394087 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-config\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.394696 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.417892 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqkr4\" (UniqueName: \"kubernetes.io/projected/c25cc920-5615-46b7-bca6-c0614071eddd-kube-api-access-cqkr4\") pod \"dnsmasq-dns-5f59b8f679-pzmjg\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.572445 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.952622 4713 generic.go:334] "Generic (PLEG): container finished" podID="5e00395b-5b37-4ba4-a4e7-7ad08388b053" containerID="8c86ed247323881d00363741bb0e256c5291e999eaa3b117add46c0aaae147da" exitCode=0 Mar 14 05:51:42 crc kubenswrapper[4713]: I0314 05:51:42.952680 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e00395b-5b37-4ba4-a4e7-7ad08388b053","Type":"ContainerDied","Data":"8c86ed247323881d00363741bb0e256c5291e999eaa3b117add46c0aaae147da"} Mar 14 05:51:45 crc kubenswrapper[4713]: W0314 05:51:45.755489 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6e2d515_79e3_4d8b_8c56_49c49d35cb2a.slice/crio-03fa37e07a6127f34628b3ded89dfa23d6529782d15865b2ae2ba3e5936e3787 WatchSource:0}: Error finding container 03fa37e07a6127f34628b3ded89dfa23d6529782d15865b2ae2ba3e5936e3787: Status 404 returned error can't find the container with id 03fa37e07a6127f34628b3ded89dfa23d6529782d15865b2ae2ba3e5936e3787 Mar 14 05:51:45 crc kubenswrapper[4713]: I0314 05:51:45.901794 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fps7b"] Mar 14 05:51:45 crc kubenswrapper[4713]: I0314 05:51:45.904969 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:45 crc kubenswrapper[4713]: I0314 05:51:45.927266 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fps7b"] Mar 14 05:51:45 crc kubenswrapper[4713]: I0314 05:51:45.987292 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-utilities\") pod \"redhat-operators-fps7b\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:45 crc kubenswrapper[4713]: I0314 05:51:45.987353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnbmp\" (UniqueName: \"kubernetes.io/projected/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-kube-api-access-xnbmp\") pod \"redhat-operators-fps7b\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:45 crc kubenswrapper[4713]: I0314 05:51:45.987505 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-catalog-content\") pod \"redhat-operators-fps7b\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.015525 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-da85-account-create-update-dnlj4" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.017107 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8cm8b" event={"ID":"bda8f62b-2f71-482e-8189-ce9ec768da83","Type":"ContainerDied","Data":"f754cd3b6e2b126dc29bcc10f16834ae7152743296ea991b59a3d120205bdfed"} Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.017146 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f754cd3b6e2b126dc29bcc10f16834ae7152743296ea991b59a3d120205bdfed" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.019002 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da85-account-create-update-dnlj4" event={"ID":"166c7fc9-cfdb-448b-8a3e-2915689f014e","Type":"ContainerDied","Data":"a89c7e09c8b43b108f8d8bc09e23d191cbf9911864cee53bd479c759b2171044"} Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.019028 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a89c7e09c8b43b108f8d8bc09e23d191cbf9911864cee53bd479c759b2171044" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.019154 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-da85-account-create-update-dnlj4" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.020879 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a7fc-account-create-update-7pwn5" event={"ID":"af90c93b-5f4e-41c0-8a65-1a480062a11f","Type":"ContainerDied","Data":"a51e97a035f65eb9a762078d4f08fa8c4050bc82e882cb69aea5711d44b26a63"} Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.021018 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a51e97a035f65eb9a762078d4f08fa8c4050bc82e882cb69aea5711d44b26a63" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.021999 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" event={"ID":"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a","Type":"ContainerStarted","Data":"03fa37e07a6127f34628b3ded89dfa23d6529782d15865b2ae2ba3e5936e3787"} Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.023122 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5p56m" event={"ID":"279d8812-ca1b-4f1e-a094-072076726e8c","Type":"ContainerDied","Data":"59fb2c30fe25ab84c1554fc2b757ba4fdc8d8b5d6be9bab47cee3007a5624d3e"} Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.023238 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59fb2c30fe25ab84c1554fc2b757ba4fdc8d8b5d6be9bab47cee3007a5624d3e" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.090016 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zrfw\" (UniqueName: \"kubernetes.io/projected/166c7fc9-cfdb-448b-8a3e-2915689f014e-kube-api-access-2zrfw\") pod \"166c7fc9-cfdb-448b-8a3e-2915689f014e\" (UID: \"166c7fc9-cfdb-448b-8a3e-2915689f014e\") " Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.090087 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166c7fc9-cfdb-448b-8a3e-2915689f014e-operator-scripts\") pod \"166c7fc9-cfdb-448b-8a3e-2915689f014e\" (UID: \"166c7fc9-cfdb-448b-8a3e-2915689f014e\") " Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.090511 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-utilities\") pod \"redhat-operators-fps7b\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.090558 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnbmp\" (UniqueName: \"kubernetes.io/projected/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-kube-api-access-xnbmp\") pod \"redhat-operators-fps7b\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.090748 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-catalog-content\") pod \"redhat-operators-fps7b\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.091332 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-catalog-content\") pod \"redhat-operators-fps7b\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.091632 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-utilities\") pod \"redhat-operators-fps7b\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.092484 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166c7fc9-cfdb-448b-8a3e-2915689f014e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "166c7fc9-cfdb-448b-8a3e-2915689f014e" (UID: "166c7fc9-cfdb-448b-8a3e-2915689f014e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.095564 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8cm8b" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.113162 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/166c7fc9-cfdb-448b-8a3e-2915689f014e-kube-api-access-2zrfw" (OuterVolumeSpecName: "kube-api-access-2zrfw") pod "166c7fc9-cfdb-448b-8a3e-2915689f014e" (UID: "166c7fc9-cfdb-448b-8a3e-2915689f014e"). InnerVolumeSpecName "kube-api-access-2zrfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.120935 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnbmp\" (UniqueName: \"kubernetes.io/projected/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-kube-api-access-xnbmp\") pod \"redhat-operators-fps7b\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.174785 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a7fc-account-create-update-7pwn5" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.175464 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5p56m" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.192383 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af90c93b-5f4e-41c0-8a65-1a480062a11f-operator-scripts\") pod \"af90c93b-5f4e-41c0-8a65-1a480062a11f\" (UID: \"af90c93b-5f4e-41c0-8a65-1a480062a11f\") " Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.192475 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9djs5\" (UniqueName: \"kubernetes.io/projected/bda8f62b-2f71-482e-8189-ce9ec768da83-kube-api-access-9djs5\") pod \"bda8f62b-2f71-482e-8189-ce9ec768da83\" (UID: \"bda8f62b-2f71-482e-8189-ce9ec768da83\") " Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.192509 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8f62b-2f71-482e-8189-ce9ec768da83-operator-scripts\") pod \"bda8f62b-2f71-482e-8189-ce9ec768da83\" (UID: \"bda8f62b-2f71-482e-8189-ce9ec768da83\") " Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.192735 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc5dh\" (UniqueName: \"kubernetes.io/projected/af90c93b-5f4e-41c0-8a65-1a480062a11f-kube-api-access-qc5dh\") pod \"af90c93b-5f4e-41c0-8a65-1a480062a11f\" (UID: \"af90c93b-5f4e-41c0-8a65-1a480062a11f\") " Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.192799 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/279d8812-ca1b-4f1e-a094-072076726e8c-operator-scripts\") pod 
\"279d8812-ca1b-4f1e-a094-072076726e8c\" (UID: \"279d8812-ca1b-4f1e-a094-072076726e8c\") " Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.192856 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk9nd\" (UniqueName: \"kubernetes.io/projected/279d8812-ca1b-4f1e-a094-072076726e8c-kube-api-access-qk9nd\") pod \"279d8812-ca1b-4f1e-a094-072076726e8c\" (UID: \"279d8812-ca1b-4f1e-a094-072076726e8c\") " Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.192892 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af90c93b-5f4e-41c0-8a65-1a480062a11f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af90c93b-5f4e-41c0-8a65-1a480062a11f" (UID: "af90c93b-5f4e-41c0-8a65-1a480062a11f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.193159 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda8f62b-2f71-482e-8189-ce9ec768da83-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bda8f62b-2f71-482e-8189-ce9ec768da83" (UID: "bda8f62b-2f71-482e-8189-ce9ec768da83"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.194194 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/279d8812-ca1b-4f1e-a094-072076726e8c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "279d8812-ca1b-4f1e-a094-072076726e8c" (UID: "279d8812-ca1b-4f1e-a094-072076726e8c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.195809 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af90c93b-5f4e-41c0-8a65-1a480062a11f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.195836 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8f62b-2f71-482e-8189-ce9ec768da83-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.195850 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/279d8812-ca1b-4f1e-a094-072076726e8c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.195866 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zrfw\" (UniqueName: \"kubernetes.io/projected/166c7fc9-cfdb-448b-8a3e-2915689f014e-kube-api-access-2zrfw\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.195904 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166c7fc9-cfdb-448b-8a3e-2915689f014e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.197731 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda8f62b-2f71-482e-8189-ce9ec768da83-kube-api-access-9djs5" (OuterVolumeSpecName: "kube-api-access-9djs5") pod "bda8f62b-2f71-482e-8189-ce9ec768da83" (UID: "bda8f62b-2f71-482e-8189-ce9ec768da83"). InnerVolumeSpecName "kube-api-access-9djs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.199181 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af90c93b-5f4e-41c0-8a65-1a480062a11f-kube-api-access-qc5dh" (OuterVolumeSpecName: "kube-api-access-qc5dh") pod "af90c93b-5f4e-41c0-8a65-1a480062a11f" (UID: "af90c93b-5f4e-41c0-8a65-1a480062a11f"). InnerVolumeSpecName "kube-api-access-qc5dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.211876 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279d8812-ca1b-4f1e-a094-072076726e8c-kube-api-access-qk9nd" (OuterVolumeSpecName: "kube-api-access-qk9nd") pod "279d8812-ca1b-4f1e-a094-072076726e8c" (UID: "279d8812-ca1b-4f1e-a094-072076726e8c"). InnerVolumeSpecName "kube-api-access-qk9nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.213993 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.302258 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9djs5\" (UniqueName: \"kubernetes.io/projected/bda8f62b-2f71-482e-8189-ce9ec768da83-kube-api-access-9djs5\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.302291 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc5dh\" (UniqueName: \"kubernetes.io/projected/af90c93b-5f4e-41c0-8a65-1a480062a11f-kube-api-access-qc5dh\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.302342 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk9nd\" (UniqueName: \"kubernetes.io/projected/279d8812-ca1b-4f1e-a094-072076726e8c-kube-api-access-qk9nd\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:46 crc kubenswrapper[4713]: I0314 05:51:46.595864 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pzmjg"] Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.009158 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fps7b"] Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.051289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fps7b" event={"ID":"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6","Type":"ContainerStarted","Data":"c2f23e3e289e71c05dee5dc0ff42cf32ee2482deeae970fc19f45cf4086e4ba4"} Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.056288 4713 generic.go:334] "Generic (PLEG): container finished" podID="b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" containerID="ec1c436c6aa1b6705061400629a91903cad6f0e5baed10fb54d8ace75b8fa488" exitCode=0 Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.056371 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" 
event={"ID":"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a","Type":"ContainerDied","Data":"ec1c436c6aa1b6705061400629a91903cad6f0e5baed10fb54d8ace75b8fa488"} Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.079453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e00395b-5b37-4ba4-a4e7-7ad08388b053","Type":"ContainerStarted","Data":"9d06c69dd8fb0fbfd4b78ff31c639d50b47dea0d5cd10b666d48cea088613c9c"} Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.082763 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g2xdv" event={"ID":"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9","Type":"ContainerStarted","Data":"71ae1a7c8b695285fb54b047398b7eb8afbf227c61ced8b82fbdf8e02c301614"} Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.089891 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a7fc-account-create-update-7pwn5" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.092375 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8cm8b" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.097340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" event={"ID":"c25cc920-5615-46b7-bca6-c0614071eddd","Type":"ContainerStarted","Data":"a04dab1981cab62232122db337ad18a3f2c2eb2353dc44d4e8894f896fbc7223"} Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.097436 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5p56m" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.130992 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-g2xdv" podStartSLOduration=4.253794075 podStartE2EDuration="11.130975082s" podCreationTimestamp="2026-03-14 05:51:36 +0000 UTC" firstStartedPulling="2026-03-14 05:51:39.103323679 +0000 UTC m=+1482.191232979" lastFinishedPulling="2026-03-14 05:51:45.980504676 +0000 UTC m=+1489.068413986" observedRunningTime="2026-03-14 05:51:47.129720472 +0000 UTC m=+1490.217629772" watchObservedRunningTime="2026-03-14 05:51:47.130975082 +0000 UTC m=+1490.218884382" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.632034 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:47 crc kubenswrapper[4713]: E0314 05:51:47.685507 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c20b670_fc3b_4cb5_b5eb_dd15e5bb9bf6.slice/crio-7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.688602 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-svc\") pod \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.688817 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92n58\" (UniqueName: \"kubernetes.io/projected/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-kube-api-access-92n58\") pod \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " 
Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.688946 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-sb\") pod \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.689057 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-config\") pod \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.689095 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-nb\") pod \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.689172 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-swift-storage-0\") pod \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\" (UID: \"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a\") " Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.698750 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-kube-api-access-92n58" (OuterVolumeSpecName: "kube-api-access-92n58") pod "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" (UID: "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a"). InnerVolumeSpecName "kube-api-access-92n58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.721169 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" (UID: "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.726732 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" (UID: "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.734899 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-config" (OuterVolumeSpecName: "config") pod "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" (UID: "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.735981 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" (UID: "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.743715 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" (UID: "b6e2d515-79e3-4d8b-8c56-49c49d35cb2a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.792594 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.792642 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.792654 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92n58\" (UniqueName: \"kubernetes.io/projected/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-kube-api-access-92n58\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.792670 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.792685 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:47 crc kubenswrapper[4713]: I0314 05:51:47.792698 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:48 crc kubenswrapper[4713]: I0314 05:51:48.105015 4713 generic.go:334] "Generic (PLEG): container finished" podID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerID="7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce" exitCode=0 Mar 14 05:51:48 crc kubenswrapper[4713]: I0314 05:51:48.105088 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fps7b" event={"ID":"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6","Type":"ContainerDied","Data":"7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce"} Mar 14 05:51:48 crc kubenswrapper[4713]: I0314 05:51:48.116403 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" event={"ID":"b6e2d515-79e3-4d8b-8c56-49c49d35cb2a","Type":"ContainerDied","Data":"03fa37e07a6127f34628b3ded89dfa23d6529782d15865b2ae2ba3e5936e3787"} Mar 14 05:51:48 crc kubenswrapper[4713]: I0314 05:51:48.116444 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-hk97k" Mar 14 05:51:48 crc kubenswrapper[4713]: I0314 05:51:48.116458 4713 scope.go:117] "RemoveContainer" containerID="ec1c436c6aa1b6705061400629a91903cad6f0e5baed10fb54d8ace75b8fa488" Mar 14 05:51:48 crc kubenswrapper[4713]: I0314 05:51:48.127299 4713 generic.go:334] "Generic (PLEG): container finished" podID="c25cc920-5615-46b7-bca6-c0614071eddd" containerID="94b325cf3c820313394333cc4fe7ca687bb9e3aa3432d7c2f037608d31f3502c" exitCode=0 Mar 14 05:51:48 crc kubenswrapper[4713]: I0314 05:51:48.127763 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" event={"ID":"c25cc920-5615-46b7-bca6-c0614071eddd","Type":"ContainerDied","Data":"94b325cf3c820313394333cc4fe7ca687bb9e3aa3432d7c2f037608d31f3502c"} Mar 14 05:51:48 crc kubenswrapper[4713]: I0314 05:51:48.434443 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-hk97k"] Mar 14 05:51:48 crc kubenswrapper[4713]: I0314 05:51:48.455487 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-hk97k"] Mar 14 05:51:49 crc kubenswrapper[4713]: I0314 05:51:49.150565 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" event={"ID":"c25cc920-5615-46b7-bca6-c0614071eddd","Type":"ContainerStarted","Data":"7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4"} Mar 14 05:51:49 crc kubenswrapper[4713]: I0314 05:51:49.151350 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:49 crc kubenswrapper[4713]: I0314 05:51:49.184617 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" podStartSLOduration=7.184597706 podStartE2EDuration="7.184597706s" podCreationTimestamp="2026-03-14 05:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:51:49.174449163 +0000 UTC m=+1492.262358483" watchObservedRunningTime="2026-03-14 05:51:49.184597706 +0000 UTC m=+1492.272507006" Mar 14 05:51:49 crc kubenswrapper[4713]: I0314 05:51:49.576779 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" path="/var/lib/kubelet/pods/b6e2d515-79e3-4d8b-8c56-49c49d35cb2a/volumes" Mar 14 05:51:50 crc kubenswrapper[4713]: I0314 05:51:50.166936 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e00395b-5b37-4ba4-a4e7-7ad08388b053","Type":"ContainerStarted","Data":"185364ad22431baa943d8e79da9e3c73aa4c00cf4ec976c274934c622cc36d79"} Mar 14 05:51:50 crc kubenswrapper[4713]: I0314 05:51:50.170157 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fps7b" event={"ID":"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6","Type":"ContainerStarted","Data":"f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17"} Mar 14 05:51:51 crc kubenswrapper[4713]: I0314 05:51:51.184700 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e00395b-5b37-4ba4-a4e7-7ad08388b053","Type":"ContainerStarted","Data":"92b255515857c778af5d7b0f6dee702e478ad7a194eb9a0e9ab90bcd0b0d8caf"} Mar 14 05:51:51 crc kubenswrapper[4713]: I0314 05:51:51.240746 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=24.240724189 podStartE2EDuration="24.240724189s" podCreationTimestamp="2026-03-14 05:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:51:51.232322491 +0000 UTC m=+1494.320231811" watchObservedRunningTime="2026-03-14 05:51:51.240724189 +0000 UTC m=+1494.328633499" Mar 14 
05:51:52 crc kubenswrapper[4713]: I0314 05:51:52.977676 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:53 crc kubenswrapper[4713]: I0314 05:51:53.223882 4713 generic.go:334] "Generic (PLEG): container finished" podID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerID="f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17" exitCode=0 Mar 14 05:51:53 crc kubenswrapper[4713]: I0314 05:51:53.224193 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fps7b" event={"ID":"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6","Type":"ContainerDied","Data":"f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17"} Mar 14 05:51:54 crc kubenswrapper[4713]: I0314 05:51:54.237755 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fps7b" event={"ID":"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6","Type":"ContainerStarted","Data":"8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a"} Mar 14 05:51:54 crc kubenswrapper[4713]: I0314 05:51:54.259961 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fps7b" podStartSLOduration=3.658537709 podStartE2EDuration="9.25994323s" podCreationTimestamp="2026-03-14 05:51:45 +0000 UTC" firstStartedPulling="2026-03-14 05:51:48.110349388 +0000 UTC m=+1491.198258688" lastFinishedPulling="2026-03-14 05:51:53.711754889 +0000 UTC m=+1496.799664209" observedRunningTime="2026-03-14 05:51:54.255122735 +0000 UTC m=+1497.343032035" watchObservedRunningTime="2026-03-14 05:51:54.25994323 +0000 UTC m=+1497.347852530" Mar 14 05:51:55 crc kubenswrapper[4713]: I0314 05:51:55.249113 4713 generic.go:334] "Generic (PLEG): container finished" podID="b9f0d8e7-9d94-46ae-a721-d4557e09a0e9" containerID="71ae1a7c8b695285fb54b047398b7eb8afbf227c61ced8b82fbdf8e02c301614" exitCode=0 Mar 14 05:51:55 crc kubenswrapper[4713]: I0314 
05:51:55.249305 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g2xdv" event={"ID":"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9","Type":"ContainerDied","Data":"71ae1a7c8b695285fb54b047398b7eb8afbf227c61ced8b82fbdf8e02c301614"} Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.215115 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.215164 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.658806 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-g2xdv" Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.805977 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-kube-api-access-mjcfv\") pod \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.806559 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-config-data\") pod \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.806684 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-combined-ca-bundle\") pod \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\" (UID: \"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9\") " Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.812751 4713 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-kube-api-access-mjcfv" (OuterVolumeSpecName: "kube-api-access-mjcfv") pod "b9f0d8e7-9d94-46ae-a721-d4557e09a0e9" (UID: "b9f0d8e7-9d94-46ae-a721-d4557e09a0e9"). InnerVolumeSpecName "kube-api-access-mjcfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.841384 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9f0d8e7-9d94-46ae-a721-d4557e09a0e9" (UID: "b9f0d8e7-9d94-46ae-a721-d4557e09a0e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.861621 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-config-data" (OuterVolumeSpecName: "config-data") pod "b9f0d8e7-9d94-46ae-a721-d4557e09a0e9" (UID: "b9f0d8e7-9d94-46ae-a721-d4557e09a0e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.909468 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.909502 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:56 crc kubenswrapper[4713]: I0314 05:51:56.909513 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9-kube-api-access-mjcfv\") on node \"crc\" DevicePath \"\"" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.268616 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g2xdv" event={"ID":"b9f0d8e7-9d94-46ae-a721-d4557e09a0e9","Type":"ContainerDied","Data":"0c0723632e7634fd096451485c52a53e1b40397f216d44bbf162ccd8fa6e1a4a"} Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.268656 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c0723632e7634fd096451485c52a53e1b40397f216d44bbf162ccd8fa6e1a4a" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.268719 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-g2xdv" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.270818 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fps7b" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" probeResult="failure" output=< Mar 14 05:51:57 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:51:57 crc kubenswrapper[4713]: > Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.497399 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dcls9"] Mar 14 05:51:57 crc kubenswrapper[4713]: E0314 05:51:57.498045 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166c7fc9-cfdb-448b-8a3e-2915689f014e" containerName="mariadb-account-create-update" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498068 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="166c7fc9-cfdb-448b-8a3e-2915689f014e" containerName="mariadb-account-create-update" Mar 14 05:51:57 crc kubenswrapper[4713]: E0314 05:51:57.498082 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279d8812-ca1b-4f1e-a094-072076726e8c" containerName="mariadb-database-create" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498091 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="279d8812-ca1b-4f1e-a094-072076726e8c" containerName="mariadb-database-create" Mar 14 05:51:57 crc kubenswrapper[4713]: E0314 05:51:57.498100 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda8f62b-2f71-482e-8189-ce9ec768da83" containerName="mariadb-database-create" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498107 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda8f62b-2f71-482e-8189-ce9ec768da83" containerName="mariadb-database-create" Mar 14 05:51:57 crc kubenswrapper[4713]: E0314 05:51:57.498122 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" containerName="init" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498127 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" containerName="init" Mar 14 05:51:57 crc kubenswrapper[4713]: E0314 05:51:57.498135 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9f0d8e7-9d94-46ae-a721-d4557e09a0e9" containerName="keystone-db-sync" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498141 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f0d8e7-9d94-46ae-a721-d4557e09a0e9" containerName="keystone-db-sync" Mar 14 05:51:57 crc kubenswrapper[4713]: E0314 05:51:57.498156 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af90c93b-5f4e-41c0-8a65-1a480062a11f" containerName="mariadb-account-create-update" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498162 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="af90c93b-5f4e-41c0-8a65-1a480062a11f" containerName="mariadb-account-create-update" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498410 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="279d8812-ca1b-4f1e-a094-072076726e8c" containerName="mariadb-database-create" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498437 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e2d515-79e3-4d8b-8c56-49c49d35cb2a" containerName="init" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498458 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9f0d8e7-9d94-46ae-a721-d4557e09a0e9" containerName="keystone-db-sync" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498475 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="af90c93b-5f4e-41c0-8a65-1a480062a11f" containerName="mariadb-account-create-update" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498489 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="166c7fc9-cfdb-448b-8a3e-2915689f014e" containerName="mariadb-account-create-update" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.498515 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda8f62b-2f71-482e-8189-ce9ec768da83" containerName="mariadb-database-create" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.499543 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.505912 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.506162 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z87lf" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.506385 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.507072 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.508302 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pzmjg"] Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.508575 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" podUID="c25cc920-5615-46b7-bca6-c0614071eddd" containerName="dnsmasq-dns" containerID="cri-o://7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4" gracePeriod=10 Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.508967 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.510378 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.529592 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dcls9"] Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.550629 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-qzpck"] Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.552955 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.626894 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-credential-keys\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.627780 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-config-data\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.627879 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htspb\" (UniqueName: \"kubernetes.io/projected/71fb2f45-0f20-479f-a9c5-e7c12bf79988-kube-api-access-htspb\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.627908 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-fernet-keys\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.627936 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-scripts\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.628023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-combined-ca-bundle\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.682692 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-qzpck"] Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.731977 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-config\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732404 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzm4\" (UniqueName: \"kubernetes.io/projected/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-kube-api-access-bbzm4\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc 
kubenswrapper[4713]: I0314 05:51:57.732462 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-config-data\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732501 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htspb\" (UniqueName: \"kubernetes.io/projected/71fb2f45-0f20-479f-a9c5-e7c12bf79988-kube-api-access-htspb\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732580 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-fernet-keys\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732618 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-scripts\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732642 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732684 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732733 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732785 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-combined-ca-bundle\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.732914 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-credential-keys\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.752325 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-config-data\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.755546 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-scripts\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.758060 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-xrnbs"] Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.766211 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.770980 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8tk8h" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.771248 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.772846 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htspb\" (UniqueName: \"kubernetes.io/projected/71fb2f45-0f20-479f-a9c5-e7c12bf79988-kube-api-access-htspb\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.782301 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xrnbs"] Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.783995 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-fernet-keys\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.784141 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-credential-keys\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.785868 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-combined-ca-bundle\") pod \"keystone-bootstrap-dcls9\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.835687 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-config\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.836007 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbzm4\" (UniqueName: \"kubernetes.io/projected/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-kube-api-access-bbzm4\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.836136 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.836304 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.836414 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.836514 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.837802 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.838549 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " 
pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.839275 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-config\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.840354 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.840224 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.848396 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.937223 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbzm4\" (UniqueName: \"kubernetes.io/projected/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-kube-api-access-bbzm4\") pod \"dnsmasq-dns-bbf5cc879-qzpck\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") " pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.938733 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm8bs\" (UniqueName: \"kubernetes.io/projected/3f9b887e-a476-4d85-8fc0-695678cee457-kube-api-access-xm8bs\") pod \"heat-db-sync-xrnbs\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.938814 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-combined-ca-bundle\") pod \"heat-db-sync-xrnbs\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.938856 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-config-data\") pod \"heat-db-sync-xrnbs\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.954217 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-r6jzk"] Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.955919 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.958700 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.975745 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.976762 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-28qsr" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.976988 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 14 05:51:57 crc kubenswrapper[4713]: I0314 05:51:57.977792 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.020334 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-r6jzk"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.031472 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.042621 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-l62vj"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.042934 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsjz2\" (UniqueName: \"kubernetes.io/projected/7d3e039f-375f-411e-b265-f6188fc80d58-kube-api-access-dsjz2\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.043043 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-scripts\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.043278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-config-data\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.043528 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-combined-ca-bundle\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.043646 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm8bs\" (UniqueName: \"kubernetes.io/projected/3f9b887e-a476-4d85-8fc0-695678cee457-kube-api-access-xm8bs\") pod \"heat-db-sync-xrnbs\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.043782 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d3e039f-375f-411e-b265-f6188fc80d58-etc-machine-id\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.043948 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-combined-ca-bundle\") pod \"heat-db-sync-xrnbs\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.044059 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-db-sync-config-data\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.044176 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-config-data\") pod \"heat-db-sync-xrnbs\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.044304 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.072803 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xzbbk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.073042 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.084328 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-combined-ca-bundle\") pod \"heat-db-sync-xrnbs\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.089220 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-config-data\") pod \"heat-db-sync-xrnbs\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.095480 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm8bs\" (UniqueName: \"kubernetes.io/projected/3f9b887e-a476-4d85-8fc0-695678cee457-kube-api-access-xm8bs\") pod \"heat-db-sync-xrnbs\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.175193 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-l62vj"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.200577 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-db-sync-config-data\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") 
" pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.200628 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-db-sync-config-data\") pod \"barbican-db-sync-l62vj\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.200727 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsjz2\" (UniqueName: \"kubernetes.io/projected/7d3e039f-375f-411e-b265-f6188fc80d58-kube-api-access-dsjz2\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.200752 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-scripts\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.200774 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gds\" (UniqueName: \"kubernetes.io/projected/0cd7eedb-d5e4-4df8-9ff6-717989483135-kube-api-access-c6gds\") pod \"barbican-db-sync-l62vj\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.200899 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-config-data\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: 
I0314 05:51:58.200948 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-combined-ca-bundle\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.200979 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-combined-ca-bundle\") pod \"barbican-db-sync-l62vj\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.240544 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d3e039f-375f-411e-b265-f6188fc80d58-etc-machine-id\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.240907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d3e039f-375f-411e-b265-f6188fc80d58-etc-machine-id\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.261978 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-pf6rx"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.264562 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.273125 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-db-sync-config-data\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.274104 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-config-data\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.319013 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.319321 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.319458 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wrtv7" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.334735 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-scripts\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.360702 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-db-sync-config-data\") pod \"barbican-db-sync-l62vj\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " 
pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.360775 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-config\") pod \"neutron-db-sync-pf6rx\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.360824 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-combined-ca-bundle\") pod \"neutron-db-sync-pf6rx\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.360885 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gds\" (UniqueName: \"kubernetes.io/projected/0cd7eedb-d5e4-4df8-9ff6-717989483135-kube-api-access-c6gds\") pod \"barbican-db-sync-l62vj\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.361043 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v66v\" (UniqueName: \"kubernetes.io/projected/ea6b1099-f4ac-4540-b964-334be68df63a-kube-api-access-6v66v\") pod \"neutron-db-sync-pf6rx\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.361093 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-combined-ca-bundle\") pod \"barbican-db-sync-l62vj\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " pod="openstack/barbican-db-sync-l62vj" Mar 
14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.361756 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsjz2\" (UniqueName: \"kubernetes.io/projected/7d3e039f-375f-411e-b265-f6188fc80d58-kube-api-access-dsjz2\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.366022 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-combined-ca-bundle\") pod \"barbican-db-sync-l62vj\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.369088 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-combined-ca-bundle\") pod \"cinder-db-sync-r6jzk\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") " pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.401091 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.422839 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pf6rx"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.431597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-db-sync-config-data\") pod \"barbican-db-sync-l62vj\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.463187 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-config\") pod \"neutron-db-sync-pf6rx\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.463361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-combined-ca-bundle\") pod \"neutron-db-sync-pf6rx\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.463595 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v66v\" (UniqueName: \"kubernetes.io/projected/ea6b1099-f4ac-4540-b964-334be68df63a-kube-api-access-6v66v\") pod \"neutron-db-sync-pf6rx\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.469999 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gds\" (UniqueName: \"kubernetes.io/projected/0cd7eedb-d5e4-4df8-9ff6-717989483135-kube-api-access-c6gds\") pod \"barbican-db-sync-l62vj\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.470696 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.479008 4713 generic.go:334] "Generic (PLEG): container finished" podID="c25cc920-5615-46b7-bca6-c0614071eddd" containerID="7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4" exitCode=0 Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.479173 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" 
event={"ID":"c25cc920-5615-46b7-bca6-c0614071eddd","Type":"ContainerDied","Data":"7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4"} Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.485040 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-qzpck"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.486056 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-combined-ca-bundle\") pod \"neutron-db-sync-pf6rx\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.490133 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-config\") pod \"neutron-db-sync-pf6rx\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.522165 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.528952 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v66v\" (UniqueName: \"kubernetes.io/projected/ea6b1099-f4ac-4540-b964-334be68df63a-kube-api-access-6v66v\") pod \"neutron-db-sync-pf6rx\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.545363 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-pzsx4"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.546927 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.549331 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.549616 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6mp8h" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.549762 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.601252 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7ksbx"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.603091 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.626525 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pzsx4"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.644791 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7ksbx"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.680901 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681122 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681164 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-config-data\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681200 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvtgp\" (UniqueName: \"kubernetes.io/projected/05a7f9cd-9580-4525-b249-7ff75958b351-kube-api-access-nvtgp\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681223 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9b3f3d-45ae-454f-9430-9b69a22a05b4-logs\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681298 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681326 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxd6s\" (UniqueName: \"kubernetes.io/projected/da9b3f3d-45ae-454f-9430-9b69a22a05b4-kube-api-access-xxd6s\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681376 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-config\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681395 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-combined-ca-bundle\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681412 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681496 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.681530 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-scripts\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.694583 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:51:58 crc 
kubenswrapper[4713]: I0314 05:51:58.694690 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.701499 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.706464 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.749868 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.751726 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.759000 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kknmp" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.759186 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.759318 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.759537 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.770308 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783281 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-config\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: 
\"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783315 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-combined-ca-bundle\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783334 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783365 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-config-data\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783387 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9gsb\" (UniqueName: \"kubernetes.io/projected/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-kube-api-access-t9gsb\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783404 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv89k\" (UniqueName: \"kubernetes.io/projected/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-kube-api-access-kv89k\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " 
pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783435 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783470 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783491 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783514 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783530 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.783546 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-run-httpd\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.787431 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-scripts\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.787713 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-scripts\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.787765 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.787792 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-config-data\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.787828 
4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvtgp\" (UniqueName: \"kubernetes.io/projected/05a7f9cd-9580-4525-b249-7ff75958b351-kube-api-access-nvtgp\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.787854 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9b3f3d-45ae-454f-9430-9b69a22a05b4-logs\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.787879 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-log-httpd\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.787916 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.787970 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-logs\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.788014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.788040 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.788079 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.788100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxd6s\" (UniqueName: \"kubernetes.io/projected/da9b3f3d-45ae-454f-9430-9b69a22a05b4-kube-api-access-xxd6s\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.788112 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.788127 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.788788 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.790105 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.790450 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9b3f3d-45ae-454f-9430-9b69a22a05b4-logs\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.791058 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-config\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.792677 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" 
(UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.820152 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-config-data\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.842800 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-scripts\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.856928 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxd6s\" (UniqueName: \"kubernetes.io/projected/da9b3f3d-45ae-454f-9430-9b69a22a05b4-kube-api-access-xxd6s\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.875993 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvtgp\" (UniqueName: \"kubernetes.io/projected/05a7f9cd-9580-4525-b249-7ff75958b351-kube-api-access-nvtgp\") pod \"dnsmasq-dns-56df8fb6b7-7ksbx\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.887568 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-combined-ca-bundle\") pod \"placement-db-sync-pzsx4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:58 crc 
kubenswrapper[4713]: I0314 05:51:58.900760 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.900806 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.900827 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-run-httpd\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.900895 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-scripts\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.900934 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-log-httpd\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.900957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.900986 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-logs\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.901021 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.901044 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.901066 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.901098 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-config-data\") pod 
\"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.901115 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9gsb\" (UniqueName: \"kubernetes.io/projected/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-kube-api-access-t9gsb\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.901134 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv89k\" (UniqueName: \"kubernetes.io/projected/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-kube-api-access-kv89k\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.901162 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.901193 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.906076 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-logs\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " 
pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.918337 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.918953 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.918979 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/364e1a6e25afa18dd00146c1f7173dd96ffacfd7742d763c37150bde38ea6657/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.919610 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-log-httpd\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.919975 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.925007 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-run-httpd\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.951029 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.954500 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.955045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9gsb\" (UniqueName: \"kubernetes.io/projected/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-kube-api-access-t9gsb\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.955093 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.956093 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.957320 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.959989 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.961172 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-config-data\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.963899 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv89k\" (UniqueName: \"kubernetes.io/projected/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-kube-api-access-kv89k\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.966632 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.966951 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.968264 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:58 crc kubenswrapper[4713]: I0314 05:51:58.996391 4713 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.008530 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.008633 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4q4v\" (UniqueName: \"kubernetes.io/projected/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-kube-api-access-t4q4v\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.008698 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.008741 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.008801 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.008837 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.008880 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.008952 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.047947 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-scripts\") pod \"ceilometer-0\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " pod="openstack/ceilometer-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.069842 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") " pod="openstack/glance-default-external-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.109799 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4q4v\" (UniqueName: \"kubernetes.io/projected/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-kube-api-access-t4q4v\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.109863 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.109894 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.109934 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.109960 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.110001 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.110045 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.110098 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.125182 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.125541 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.127797 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.127838 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/09d437df07f0cc980684bdc1a6436f63ebf1e68a215d57555012df4017c88ddd/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.138317 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.139965 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.177180 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.216088 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.226919 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4q4v\" (UniqueName: \"kubernetes.io/projected/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-kube-api-access-t4q4v\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.227680 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8tk8h" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.233600 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xrnbs" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.251507 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-28qsr" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.259389 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-r6jzk" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.284749 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xzbbk" Mar 14 05:51:59 crc kubenswrapper[4713]: E0314 05:51:59.286949 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-conmon-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.295539 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-l62vj" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.359868 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wrtv7" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.365557 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.396341 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pzsx4" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.402490 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.415154 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.434146 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.442257 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dcls9"] Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.460895 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.500539 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.501160 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.543860 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dcls9" event={"ID":"71fb2f45-0f20-479f-a9c5-e7c12bf79988","Type":"ContainerStarted","Data":"a812fd5e5831e75ee373f2faa501e69ac2e8599f51775118d1783c691cf0e43d"} Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.691736 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-qzpck"] Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.773378 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.843742 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-swift-storage-0\") pod \"c25cc920-5615-46b7-bca6-c0614071eddd\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.843805 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqkr4\" (UniqueName: \"kubernetes.io/projected/c25cc920-5615-46b7-bca6-c0614071eddd-kube-api-access-cqkr4\") pod \"c25cc920-5615-46b7-bca6-c0614071eddd\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.843901 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-sb\") pod \"c25cc920-5615-46b7-bca6-c0614071eddd\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " Mar 14 05:51:59 crc 
kubenswrapper[4713]: I0314 05:51:59.843944 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-nb\") pod \"c25cc920-5615-46b7-bca6-c0614071eddd\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.844065 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-svc\") pod \"c25cc920-5615-46b7-bca6-c0614071eddd\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.844108 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-config\") pod \"c25cc920-5615-46b7-bca6-c0614071eddd\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.896114 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c25cc920-5615-46b7-bca6-c0614071eddd-kube-api-access-cqkr4" (OuterVolumeSpecName: "kube-api-access-cqkr4") pod "c25cc920-5615-46b7-bca6-c0614071eddd" (UID: "c25cc920-5615-46b7-bca6-c0614071eddd"). InnerVolumeSpecName "kube-api-access-cqkr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:51:59 crc kubenswrapper[4713]: I0314 05:51:59.947165 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqkr4\" (UniqueName: \"kubernetes.io/projected/c25cc920-5615-46b7-bca6-c0614071eddd-kube-api-access-cqkr4\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.047197 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c25cc920-5615-46b7-bca6-c0614071eddd" (UID: "c25cc920-5615-46b7-bca6-c0614071eddd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.048632 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c25cc920-5615-46b7-bca6-c0614071eddd" (UID: "c25cc920-5615-46b7-bca6-c0614071eddd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.048997 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-svc\") pod \"c25cc920-5615-46b7-bca6-c0614071eddd\" (UID: \"c25cc920-5615-46b7-bca6-c0614071eddd\") " Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.049853 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:00 crc kubenswrapper[4713]: W0314 05:52:00.049930 4713 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c25cc920-5615-46b7-bca6-c0614071eddd/volumes/kubernetes.io~configmap/dns-svc Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.049942 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c25cc920-5615-46b7-bca6-c0614071eddd" (UID: "c25cc920-5615-46b7-bca6-c0614071eddd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.060902 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c25cc920-5615-46b7-bca6-c0614071eddd" (UID: "c25cc920-5615-46b7-bca6-c0614071eddd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.124973 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-config" (OuterVolumeSpecName: "config") pod "c25cc920-5615-46b7-bca6-c0614071eddd" (UID: "c25cc920-5615-46b7-bca6-c0614071eddd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.139666 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c25cc920-5615-46b7-bca6-c0614071eddd" (UID: "c25cc920-5615-46b7-bca6-c0614071eddd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.161058 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.162136 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.162153 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.162179 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25cc920-5615-46b7-bca6-c0614071eddd-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:00 crc 
kubenswrapper[4713]: I0314 05:52:00.176251 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557792-f568x"] Mar 14 05:52:00 crc kubenswrapper[4713]: E0314 05:52:00.176768 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25cc920-5615-46b7-bca6-c0614071eddd" containerName="init" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.176787 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25cc920-5615-46b7-bca6-c0614071eddd" containerName="init" Mar 14 05:52:00 crc kubenswrapper[4713]: E0314 05:52:00.176801 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25cc920-5615-46b7-bca6-c0614071eddd" containerName="dnsmasq-dns" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.176808 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25cc920-5615-46b7-bca6-c0614071eddd" containerName="dnsmasq-dns" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.177516 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25cc920-5615-46b7-bca6-c0614071eddd" containerName="dnsmasq-dns" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.178657 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557792-f568x" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.181338 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.182688 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.194010 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557792-f568x"] Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.196449 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.265277 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zs69\" (UniqueName: \"kubernetes.io/projected/06adc2e2-0e41-49dc-8deb-0674f50a77de-kube-api-access-6zs69\") pod \"auto-csr-approver-29557792-f568x\" (UID: \"06adc2e2-0e41-49dc-8deb-0674f50a77de\") " pod="openshift-infra/auto-csr-approver-29557792-f568x" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.387703 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zs69\" (UniqueName: \"kubernetes.io/projected/06adc2e2-0e41-49dc-8deb-0674f50a77de-kube-api-access-6zs69\") pod \"auto-csr-approver-29557792-f568x\" (UID: \"06adc2e2-0e41-49dc-8deb-0674f50a77de\") " pod="openshift-infra/auto-csr-approver-29557792-f568x" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.422345 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zs69\" (UniqueName: \"kubernetes.io/projected/06adc2e2-0e41-49dc-8deb-0674f50a77de-kube-api-access-6zs69\") pod \"auto-csr-approver-29557792-f568x\" (UID: \"06adc2e2-0e41-49dc-8deb-0674f50a77de\") " 
pod="openshift-infra/auto-csr-approver-29557792-f568x" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.630957 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" event={"ID":"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c","Type":"ContainerStarted","Data":"408aadd0a3c83460261bb641e5e8be499b82983ae8178267bd8cecda8a0732d3"} Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.639037 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-r6jzk"] Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.640035 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" event={"ID":"c25cc920-5615-46b7-bca6-c0614071eddd","Type":"ContainerDied","Data":"a04dab1981cab62232122db337ad18a3f2c2eb2353dc44d4e8894f896fbc7223"} Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.640088 4713 scope.go:117] "RemoveContainer" containerID="7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.640220 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pzmjg" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.700634 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dcls9" podStartSLOduration=3.700611692 podStartE2EDuration="3.700611692s" podCreationTimestamp="2026-03-14 05:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:00.690835751 +0000 UTC m=+1503.778745051" watchObservedRunningTime="2026-03-14 05:52:00.700611692 +0000 UTC m=+1503.788520992" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.716344 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557792-f568x" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.750094 4713 scope.go:117] "RemoveContainer" containerID="94b325cf3c820313394333cc4fe7ca687bb9e3aa3432d7c2f037608d31f3502c" Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.758868 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pzmjg"] Mar 14 05:52:00 crc kubenswrapper[4713]: I0314 05:52:00.781790 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pzmjg"] Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.031096 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-l62vj"] Mar 14 05:52:01 crc kubenswrapper[4713]: W0314 05:52:01.102494 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f9b887e_a476_4d85_8fc0_695678cee457.slice/crio-df66ce89eaafd0f2a99b8f57a9394fb8055b36955bab4e4d50e778e56b774c74 WatchSource:0}: Error finding container df66ce89eaafd0f2a99b8f57a9394fb8055b36955bab4e4d50e778e56b774c74: Status 404 returned error can't find the container with id df66ce89eaafd0f2a99b8f57a9394fb8055b36955bab4e4d50e778e56b774c74 Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.105078 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xrnbs"] Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.258492 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 05:52:01 crc kubenswrapper[4713]: W0314 05:52:01.279990 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda9b3f3d_45ae_454f_9430_9b69a22a05b4.slice/crio-ddd18964f814273be38c59844586207e5d6c81fe515ee46e36de2366e99e2504 WatchSource:0}: Error finding container 
ddd18964f814273be38c59844586207e5d6c81fe515ee46e36de2366e99e2504: Status 404 returned error can't find the container with id ddd18964f814273be38c59844586207e5d6c81fe515ee46e36de2366e99e2504 Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.316019 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pzsx4"] Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.388450 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.533125 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pf6rx"] Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.690552 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c25cc920-5615-46b7-bca6-c0614071eddd" path="/var/lib/kubelet/pods/c25cc920-5615-46b7-bca6-c0614071eddd/volumes" Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.724632 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.832620 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pzsx4" event={"ID":"da9b3f3d-45ae-454f-9430-9b69a22a05b4","Type":"ContainerStarted","Data":"ddd18964f814273be38c59844586207e5d6c81fe515ee46e36de2366e99e2504"} Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.848527 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l62vj" event={"ID":"0cd7eedb-d5e4-4df8-9ff6-717989483135","Type":"ContainerStarted","Data":"79c22985875d145740a12bc8e903b6edd5b0256d8895b44e3305006c8a9e26e3"} Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.891469 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02","Type":"ContainerStarted","Data":"47f33a188c51ad04822f9029e5c3b3f2d71f94c7819d4167392e415587099bbb"} Mar 14 05:52:01 crc kubenswrapper[4713]: I0314 05:52:01.949890 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7ksbx"] Mar 14 05:52:02 crc kubenswrapper[4713]: I0314 05:52:02.000230 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:52:02 crc kubenswrapper[4713]: I0314 05:52:02.039687 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pf6rx" event={"ID":"ea6b1099-f4ac-4540-b964-334be68df63a","Type":"ContainerStarted","Data":"97c8fad75e697e3abebd86a4b757a42841e6efabbaebcddeac3710e711cbb9b5"} Mar 14 05:52:02 crc kubenswrapper[4713]: I0314 05:52:02.045554 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xrnbs" event={"ID":"3f9b887e-a476-4d85-8fc0-695678cee457","Type":"ContainerStarted","Data":"df66ce89eaafd0f2a99b8f57a9394fb8055b36955bab4e4d50e778e56b774c74"} Mar 14 05:52:02 crc kubenswrapper[4713]: I0314 05:52:02.111179 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dcls9" event={"ID":"71fb2f45-0f20-479f-a9c5-e7c12bf79988","Type":"ContainerStarted","Data":"c477c82e94dd15fb3e06044eb41a97f13c532c496167fe75e233aa8daea3bc44"} Mar 14 05:52:02 crc kubenswrapper[4713]: I0314 05:52:02.166004 4713 generic.go:334] "Generic (PLEG): container finished" podID="18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" containerID="6073bb4ed5bf36ea466c8893df6a8305f17725e044f444c37bd9d577689fb86c" exitCode=0 Mar 14 05:52:02 crc kubenswrapper[4713]: I0314 05:52:02.166109 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" event={"ID":"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c","Type":"ContainerDied","Data":"6073bb4ed5bf36ea466c8893df6a8305f17725e044f444c37bd9d577689fb86c"} Mar 14 05:52:02 crc kubenswrapper[4713]: 
I0314 05:52:02.175640 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:52:02 crc kubenswrapper[4713]: I0314 05:52:02.178039 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-r6jzk" event={"ID":"7d3e039f-375f-411e-b265-f6188fc80d58","Type":"ContainerStarted","Data":"c06b75d18494115cc4eddd4c97d98336c64a2a616abecae7e291c4ed369cba31"} Mar 14 05:52:02 crc kubenswrapper[4713]: I0314 05:52:02.186236 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:52:02 crc kubenswrapper[4713]: W0314 05:52:02.219328 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7595f4_466a_4aac_8b5d_0aa2551ef4a4.slice/crio-ca8b8a972789bd6c351e2e0ecb89b77f6b394a9570e3b0f56058f0e04e655d88 WatchSource:0}: Error finding container ca8b8a972789bd6c351e2e0ecb89b77f6b394a9570e3b0f56058f0e04e655d88: Status 404 returned error can't find the container with id ca8b8a972789bd6c351e2e0ecb89b77f6b394a9570e3b0f56058f0e04e655d88 Mar 14 05:52:02 crc kubenswrapper[4713]: I0314 05:52:02.416142 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557792-f568x"] Mar 14 05:52:02 crc kubenswrapper[4713]: W0314 05:52:02.473370 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06adc2e2_0e41_49dc_8deb_0674f50a77de.slice/crio-905331c1af49f2b1b7cb69a90714f07e5b5248e6a27215e5b74520422748abd8 WatchSource:0}: Error finding container 905331c1af49f2b1b7cb69a90714f07e5b5248e6a27215e5b74520422748abd8: Status 404 returned error can't find the container with id 905331c1af49f2b1b7cb69a90714f07e5b5248e6a27215e5b74520422748abd8 Mar 14 05:52:02 crc kubenswrapper[4713]: E0314 05:52:02.628602 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]"
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.023732 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-qzpck"
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.052761 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-nb\") pod \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") "
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.053234 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-config\") pod \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") "
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.053277 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbzm4\" (UniqueName: \"kubernetes.io/projected/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-kube-api-access-bbzm4\") pod \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") "
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.053325 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-sb\") pod \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") "
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.053413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-svc\") pod \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") "
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.053602 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-swift-storage-0\") pod \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\" (UID: \"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c\") "
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.082570 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-kube-api-access-bbzm4" (OuterVolumeSpecName: "kube-api-access-bbzm4") pod "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" (UID: "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c"). InnerVolumeSpecName "kube-api-access-bbzm4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.188713 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbzm4\" (UniqueName: \"kubernetes.io/projected/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-kube-api-access-bbzm4\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.214930 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" (UID: "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.223676 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-qzpck" event={"ID":"18ab7ca9-f75c-4a70-b1eb-dc44f487b88c","Type":"ContainerDied","Data":"408aadd0a3c83460261bb641e5e8be499b82983ae8178267bd8cecda8a0732d3"}
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.223728 4713 scope.go:117] "RemoveContainer" containerID="6073bb4ed5bf36ea466c8893df6a8305f17725e044f444c37bd9d577689fb86c"
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.223831 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-qzpck"
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.255552 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pf6rx" event={"ID":"ea6b1099-f4ac-4540-b964-334be68df63a","Type":"ContainerStarted","Data":"0fc9d4ad5f5c526af06bf8c1b7a8f59df89cb7702b3d4c67df33c2386b50eed1"}
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.259653 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4","Type":"ContainerStarted","Data":"ca8b8a972789bd6c351e2e0ecb89b77f6b394a9570e3b0f56058f0e04e655d88"}
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.274224 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557792-f568x" event={"ID":"06adc2e2-0e41-49dc-8deb-0674f50a77de","Type":"ContainerStarted","Data":"905331c1af49f2b1b7cb69a90714f07e5b5248e6a27215e5b74520422748abd8"}
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.280225 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-pf6rx" podStartSLOduration=6.280190609 podStartE2EDuration="6.280190609s" podCreationTimestamp="2026-03-14 05:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:03.27457449 +0000 UTC m=+1506.362483790" watchObservedRunningTime="2026-03-14 05:52:03.280190609 +0000 UTC m=+1506.368099909"
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.292073 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.296729 4713 generic.go:334] "Generic (PLEG): container finished" podID="05a7f9cd-9580-4525-b249-7ff75958b351" containerID="8402ea8a26db465723983dd05e69897060bc568f1ff0340827362cb3c06d0a91" exitCode=0
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.297001 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" event={"ID":"05a7f9cd-9580-4525-b249-7ff75958b351","Type":"ContainerDied","Data":"8402ea8a26db465723983dd05e69897060bc568f1ff0340827362cb3c06d0a91"}
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.297041 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" event={"ID":"05a7f9cd-9580-4525-b249-7ff75958b351","Type":"ContainerStarted","Data":"f1e04b8c94e9d3a5a8072660f77ffae24d8fb09a8d30f4674ea6bfa9895db0d2"}
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.333366 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e410ef9-e81f-4b9f-b12b-46db77edeb7e","Type":"ContainerStarted","Data":"0f3e7130024aeb6928e1559de180882837b1aa9a5a6a4f703e4111f41d02c48d"}
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.390497 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-config" (OuterVolumeSpecName: "config") pod "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" (UID: "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.393734 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-config\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.402696 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" (UID: "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.459753 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" (UID: "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.498674 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.498996 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.510900 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" (UID: "18ab7ca9-f75c-4a70-b1eb-dc44f487b88c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.622195 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.661285 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-qzpck"]
Mar 14 05:52:03 crc kubenswrapper[4713]: I0314 05:52:03.676277 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-qzpck"]
Mar 14 05:52:04 crc kubenswrapper[4713]: I0314 05:52:04.375347 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4","Type":"ContainerStarted","Data":"752bbd4a476739bc659635e8469ec6ebc4a584cab2c74ae676f2e96eeee846dc"}
Mar 14 05:52:04 crc kubenswrapper[4713]: I0314 05:52:04.457365 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" event={"ID":"05a7f9cd-9580-4525-b249-7ff75958b351","Type":"ContainerStarted","Data":"d185a15f99357520c8ff20db4acbb3592644177ab32be7d2a0b8ff06f5437eef"}
Mar 14 05:52:04 crc kubenswrapper[4713]: I0314 05:52:04.459267 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx"
Mar 14 05:52:04 crc kubenswrapper[4713]: I0314 05:52:04.477664 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02","Type":"ContainerStarted","Data":"4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb"}
Mar 14 05:52:04 crc kubenswrapper[4713]: I0314 05:52:04.502570 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" podStartSLOduration=6.502550764 podStartE2EDuration="6.502550764s" podCreationTimestamp="2026-03-14 05:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:04.495986454 +0000 UTC m=+1507.583895774" watchObservedRunningTime="2026-03-14 05:52:04.502550764 +0000 UTC m=+1507.590460064"
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.522182 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02","Type":"ContainerStarted","Data":"74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74"}
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.522234 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerName="glance-log" containerID="cri-o://4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb" gracePeriod=30
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.522296 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerName="glance-httpd" containerID="cri-o://74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74" gracePeriod=30
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.529555 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4","Type":"ContainerStarted","Data":"f4facb5a6e6a2d16bc6e231d009a9e8b5c64b624c5c4cf3206d44f0926366b1a"}
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.529747 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerName="glance-log" containerID="cri-o://752bbd4a476739bc659635e8469ec6ebc4a584cab2c74ae676f2e96eeee846dc" gracePeriod=30
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.530083 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerName="glance-httpd" containerID="cri-o://f4facb5a6e6a2d16bc6e231d009a9e8b5c64b624c5c4cf3206d44f0926366b1a" gracePeriod=30
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.534126 4713 generic.go:334] "Generic (PLEG): container finished" podID="06adc2e2-0e41-49dc-8deb-0674f50a77de" containerID="9bf31c596e86879aa5ffd03aac8798a400347ce413f33b26b8eae8b77c71d596" exitCode=0
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.534331 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557792-f568x" event={"ID":"06adc2e2-0e41-49dc-8deb-0674f50a77de","Type":"ContainerDied","Data":"9bf31c596e86879aa5ffd03aac8798a400347ce413f33b26b8eae8b77c71d596"}
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.569556 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.56953202 podStartE2EDuration="8.56953202s" podCreationTimestamp="2026-03-14 05:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:05.558156348 +0000 UTC m=+1508.646065648" watchObservedRunningTime="2026-03-14 05:52:05.56953202 +0000 UTC m=+1508.657441320"
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.584854 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" path="/var/lib/kubelet/pods/18ab7ca9-f75c-4a70-b1eb-dc44f487b88c/volumes"
Mar 14 05:52:05 crc kubenswrapper[4713]: I0314 05:52:05.601104 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.601081585 podStartE2EDuration="8.601081585s" podCreationTimestamp="2026-03-14 05:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:05.586529482 +0000 UTC m=+1508.674438802" watchObservedRunningTime="2026-03-14 05:52:05.601081585 +0000 UTC m=+1508.688990885"
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.359885 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.551490 4713 generic.go:334] "Generic (PLEG): container finished" podID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerID="74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74" exitCode=0
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.551529 4713 generic.go:334] "Generic (PLEG): container finished" podID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerID="4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb" exitCode=143
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.551524 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02","Type":"ContainerDied","Data":"74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74"}
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.551593 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02","Type":"ContainerDied","Data":"4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb"}
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.551608 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02","Type":"ContainerDied","Data":"47f33a188c51ad04822f9029e5c3b3f2d71f94c7819d4167392e415587099bbb"}
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.551625 4713 scope.go:117] "RemoveContainer" containerID="74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74"
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.551510 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.552109 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-public-tls-certs\") pod \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") "
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.552330 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-combined-ca-bundle\") pod \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") "
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.552371 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-logs\") pod \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") "
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.552545 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv89k\" (UniqueName: \"kubernetes.io/projected/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-kube-api-access-kv89k\") pod \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") "
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.552684 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") "
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.552755 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-scripts\") pod \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") "
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.552787 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-config-data\") pod \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") "
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.552852 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-httpd-run\") pod \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\" (UID: \"aa20008f-3c8d-4cb8-9f2e-651c7fab2e02\") "
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.555080 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" (UID: "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.555760 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-logs" (OuterVolumeSpecName: "logs") pod "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" (UID: "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.560831 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-scripts" (OuterVolumeSpecName: "scripts") pod "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" (UID: "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.563307 4713 generic.go:334] "Generic (PLEG): container finished" podID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerID="f4facb5a6e6a2d16bc6e231d009a9e8b5c64b624c5c4cf3206d44f0926366b1a" exitCode=0
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.563340 4713 generic.go:334] "Generic (PLEG): container finished" podID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerID="752bbd4a476739bc659635e8469ec6ebc4a584cab2c74ae676f2e96eeee846dc" exitCode=143
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.564430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4","Type":"ContainerDied","Data":"f4facb5a6e6a2d16bc6e231d009a9e8b5c64b624c5c4cf3206d44f0926366b1a"}
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.566227 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4","Type":"ContainerDied","Data":"752bbd4a476739bc659635e8469ec6ebc4a584cab2c74ae676f2e96eeee846dc"}
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.566044 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-kube-api-access-kv89k" (OuterVolumeSpecName: "kube-api-access-kv89k") pod "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" (UID: "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02"). InnerVolumeSpecName "kube-api-access-kv89k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.598484 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" (UID: "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.598715 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810" (OuterVolumeSpecName: "glance") pod "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" (UID: "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02"). InnerVolumeSpecName "pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.633449 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-config-data" (OuterVolumeSpecName: "config-data") pod "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" (UID: "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.655145 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" (UID: "aa20008f-3c8d-4cb8-9f2e-651c7fab2e02"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.657930 4713 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.657967 4713 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.657979 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.657993 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-logs\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.658002 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv89k\" (UniqueName: \"kubernetes.io/projected/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-kube-api-access-kv89k\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.658049 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") on node \"crc\" "
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.658061 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.658070 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.697363 4713 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.697504 4713 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810") on node "crc"
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.764232 4713 reconciler_common.go:293] "Volume detached for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") on node \"crc\" DevicePath \"\""
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.959616 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 05:52:06 crc kubenswrapper[4713]: I0314 05:52:06.972577 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.012136 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 05:52:07 crc kubenswrapper[4713]: E0314 05:52:07.012905 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" containerName="init"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.012928 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" containerName="init"
Mar 14 05:52:07 crc kubenswrapper[4713]: E0314 05:52:07.012957 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerName="glance-log"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.012964 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerName="glance-log"
Mar 14 05:52:07 crc kubenswrapper[4713]: E0314 05:52:07.012976 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerName="glance-httpd"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.012985 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerName="glance-httpd"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.013423 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerName="glance-log"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.013450 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ab7ca9-f75c-4a70-b1eb-dc44f487b88c" containerName="init"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.013468 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" containerName="glance-httpd"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.015246 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.026644 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.026892 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.030938 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.180020 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwkk\" (UniqueName: \"kubernetes.io/projected/b733ba7d-6fd3-430d-83ee-3d9f32bad251-kube-api-access-npwkk\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.180559 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.180633 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-logs\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.180729 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.180812 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-config-data\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.180870 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.181111 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-scripts\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.181166 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.286541 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-config-data\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.286651 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.286791 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-scripts\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.286835 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.286999 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npwkk\" (UniqueName: \"kubernetes.io/projected/b733ba7d-6fd3-430d-83ee-3d9f32bad251-kube-api-access-npwkk\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.287027 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.287120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-logs\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.287342 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.289001 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-logs\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.290862 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fps7b" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" probeResult="failure" output=<
Mar 14 05:52:07 crc kubenswrapper[4713]: 	timeout: failed to connect service ":50051" within 1s
Mar 14 05:52:07 crc kubenswrapper[4713]: >
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.296265 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName:
\"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0" Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.297117 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0" Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.298978 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-scripts\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0" Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.299083 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0" Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.300418 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-config-data\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0" Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.302036 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.302118 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/364e1a6e25afa18dd00146c1f7173dd96ffacfd7742d763c37150bde38ea6657/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.311036 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwkk\" (UniqueName: \"kubernetes.io/projected/b733ba7d-6fd3-430d-83ee-3d9f32bad251-kube-api-access-npwkk\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0" Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.443779 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " pod="openstack/glance-default-external-api-0" Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.594062 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa20008f-3c8d-4cb8-9f2e-651c7fab2e02" path="/var/lib/kubelet/pods/aa20008f-3c8d-4cb8-9f2e-651c7fab2e02/volumes" Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.631943 4713 generic.go:334] "Generic (PLEG): container finished" podID="71fb2f45-0f20-479f-a9c5-e7c12bf79988" containerID="c477c82e94dd15fb3e06044eb41a97f13c532c496167fe75e233aa8daea3bc44" exitCode=0 Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.631992 
4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dcls9" event={"ID":"71fb2f45-0f20-479f-a9c5-e7c12bf79988","Type":"ContainerDied","Data":"c477c82e94dd15fb3e06044eb41a97f13c532c496167fe75e233aa8daea3bc44"} Mar 14 05:52:07 crc kubenswrapper[4713]: I0314 05:52:07.664264 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.417340 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.494167 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-24qbb"] Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.495152 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="dnsmasq-dns" containerID="cri-o://a5232079bc8b459c680cc65ce29e9e95c4d5a9cffd0f85b22f96c339bf438e63" gracePeriod=10 Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.654963 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557792-f568x" Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.694664 4713 generic.go:334] "Generic (PLEG): container finished" podID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerID="a5232079bc8b459c680cc65ce29e9e95c4d5a9cffd0f85b22f96c339bf438e63" exitCode=0 Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.694798 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" event={"ID":"9fd5a7aa-b08a-47d4-a730-73b10501f049","Type":"ContainerDied","Data":"a5232079bc8b459c680cc65ce29e9e95c4d5a9cffd0f85b22f96c339bf438e63"} Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.702586 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557792-f568x" event={"ID":"06adc2e2-0e41-49dc-8deb-0674f50a77de","Type":"ContainerDied","Data":"905331c1af49f2b1b7cb69a90714f07e5b5248e6a27215e5b74520422748abd8"} Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.702640 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="905331c1af49f2b1b7cb69a90714f07e5b5248e6a27215e5b74520422748abd8" Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.702768 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557792-f568x" Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.815401 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zs69\" (UniqueName: \"kubernetes.io/projected/06adc2e2-0e41-49dc-8deb-0674f50a77de-kube-api-access-6zs69\") pod \"06adc2e2-0e41-49dc-8deb-0674f50a77de\" (UID: \"06adc2e2-0e41-49dc-8deb-0674f50a77de\") " Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.825538 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06adc2e2-0e41-49dc-8deb-0674f50a77de-kube-api-access-6zs69" (OuterVolumeSpecName: "kube-api-access-6zs69") pod "06adc2e2-0e41-49dc-8deb-0674f50a77de" (UID: "06adc2e2-0e41-49dc-8deb-0674f50a77de"). InnerVolumeSpecName "kube-api-access-6zs69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:09 crc kubenswrapper[4713]: I0314 05:52:09.917666 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zs69\" (UniqueName: \"kubernetes.io/projected/06adc2e2-0e41-49dc-8deb-0674f50a77de-kube-api-access-6zs69\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:09 crc kubenswrapper[4713]: E0314 05:52:09.940896 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fd5a7aa_b08a_47d4_a730_73b10501f049.slice/crio-conmon-a5232079bc8b459c680cc65ce29e9e95c4d5a9cffd0f85b22f96c339bf438e63.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:10 crc kubenswrapper[4713]: I0314 05:52:10.732317 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:52:10 crc kubenswrapper[4713]: I0314 05:52:10.732407 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:52:10 crc kubenswrapper[4713]: I0314 05:52:10.732467 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:52:10 crc kubenswrapper[4713]: I0314 05:52:10.733689 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"540f96db525a7cfd501d37d526d1efc7f6c97b5c6c41b9d3d69eed7cce8a0419"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 05:52:10 crc kubenswrapper[4713]: I0314 05:52:10.733767 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://540f96db525a7cfd501d37d526d1efc7f6c97b5c6c41b9d3d69eed7cce8a0419" gracePeriod=600 Mar 14 05:52:10 crc kubenswrapper[4713]: I0314 05:52:10.763119 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557786-qtpnn"] Mar 14 05:52:10 crc kubenswrapper[4713]: I0314 05:52:10.773506 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557786-qtpnn"] 
Mar 14 05:52:11 crc kubenswrapper[4713]: I0314 05:52:11.396313 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Mar 14 05:52:11 crc kubenswrapper[4713]: I0314 05:52:11.580297 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77944775-7eab-4ede-8a7d-31a489a29ae3" path="/var/lib/kubelet/pods/77944775-7eab-4ede-8a7d-31a489a29ae3/volumes" Mar 14 05:52:11 crc kubenswrapper[4713]: I0314 05:52:11.738991 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="540f96db525a7cfd501d37d526d1efc7f6c97b5c6c41b9d3d69eed7cce8a0419" exitCode=0 Mar 14 05:52:11 crc kubenswrapper[4713]: I0314 05:52:11.739037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"540f96db525a7cfd501d37d526d1efc7f6c97b5c6c41b9d3d69eed7cce8a0419"} Mar 14 05:52:16 crc kubenswrapper[4713]: I0314 05:52:16.396880 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Mar 14 05:52:17 crc kubenswrapper[4713]: I0314 05:52:17.260070 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fps7b" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" probeResult="failure" output=< Mar 14 05:52:17 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:52:17 crc kubenswrapper[4713]: > Mar 14 05:52:17 crc kubenswrapper[4713]: E0314 05:52:17.829675 4713 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.229020 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.236184 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.422672 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-config-data\") pod \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423446 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423495 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htspb\" (UniqueName: \"kubernetes.io/projected/71fb2f45-0f20-479f-a9c5-e7c12bf79988-kube-api-access-htspb\") pod \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423568 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4q4v\" (UniqueName: 
\"kubernetes.io/projected/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-kube-api-access-t4q4v\") pod \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423625 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-fernet-keys\") pod \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423659 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-scripts\") pod \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423693 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-config-data\") pod \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423779 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-scripts\") pod \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423807 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-combined-ca-bundle\") pod \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423860 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-logs\") pod \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.423914 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-credential-keys\") pod \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.424012 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-internal-tls-certs\") pod \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.424070 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-httpd-run\") pod \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\" (UID: \"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.424163 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-combined-ca-bundle\") pod \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\" (UID: \"71fb2f45-0f20-479f-a9c5-e7c12bf79988\") " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.427044 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" (UID: "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.427406 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-logs" (OuterVolumeSpecName: "logs") pod "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" (UID: "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.430813 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71fb2f45-0f20-479f-a9c5-e7c12bf79988-kube-api-access-htspb" (OuterVolumeSpecName: "kube-api-access-htspb") pod "71fb2f45-0f20-479f-a9c5-e7c12bf79988" (UID: "71fb2f45-0f20-479f-a9c5-e7c12bf79988"). InnerVolumeSpecName "kube-api-access-htspb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.432445 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "71fb2f45-0f20-479f-a9c5-e7c12bf79988" (UID: "71fb2f45-0f20-479f-a9c5-e7c12bf79988"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.432559 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-scripts" (OuterVolumeSpecName: "scripts") pod "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" (UID: "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.432654 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-scripts" (OuterVolumeSpecName: "scripts") pod "71fb2f45-0f20-479f-a9c5-e7c12bf79988" (UID: "71fb2f45-0f20-479f-a9c5-e7c12bf79988"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.435734 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-kube-api-access-t4q4v" (OuterVolumeSpecName: "kube-api-access-t4q4v") pod "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" (UID: "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4"). InnerVolumeSpecName "kube-api-access-t4q4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.453905 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034" (OuterVolumeSpecName: "glance") pod "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" (UID: "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4"). InnerVolumeSpecName "pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.459913 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-config-data" (OuterVolumeSpecName: "config-data") pod "71fb2f45-0f20-479f-a9c5-e7c12bf79988" (UID: "71fb2f45-0f20-479f-a9c5-e7c12bf79988"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.461376 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" (UID: "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.468303 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71fb2f45-0f20-479f-a9c5-e7c12bf79988" (UID: "71fb2f45-0f20-479f-a9c5-e7c12bf79988"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.473505 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "71fb2f45-0f20-479f-a9c5-e7c12bf79988" (UID: "71fb2f45-0f20-479f-a9c5-e7c12bf79988"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.498771 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-config-data" (OuterVolumeSpecName: "config-data") pod "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" (UID: "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527651 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4q4v\" (UniqueName: \"kubernetes.io/projected/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-kube-api-access-t4q4v\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527692 4713 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527702 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527713 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527720 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527729 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527738 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-logs\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527748 4713 reconciler_common.go:293] "Volume detached for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527756 4713 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527764 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527772 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71fb2f45-0f20-479f-a9c5-e7c12bf79988-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527796 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") on node \"crc\" " Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.527807 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htspb\" (UniqueName: \"kubernetes.io/projected/71fb2f45-0f20-479f-a9c5-e7c12bf79988-kube-api-access-htspb\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.538227 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" (UID: "7d7595f4-466a-4aac-8b5d-0aa2551ef4a4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.566920 4713 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.567083 4713 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034") on node "crc" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.630801 4713 reconciler_common.go:293] "Volume detached for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.631566 4713 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:18 crc kubenswrapper[4713]: E0314 05:52:18.636421 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Mar 14 05:52:18 crc kubenswrapper[4713]: E0314 05:52:18.636581 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xm8bs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-xrnbs_openstack(3f9b887e-a476-4d85-8fc0-695678cee457): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 
14 05:52:18 crc kubenswrapper[4713]: E0314 05:52:18.637777 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-xrnbs" podUID="3f9b887e-a476-4d85-8fc0-695678cee457" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.856794 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dcls9" event={"ID":"71fb2f45-0f20-479f-a9c5-e7c12bf79988","Type":"ContainerDied","Data":"a812fd5e5831e75ee373f2faa501e69ac2e8599f51775118d1783c691cf0e43d"} Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.856833 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a812fd5e5831e75ee373f2faa501e69ac2e8599f51775118d1783c691cf0e43d" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.856840 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dcls9" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.863676 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.863676 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d7595f4-466a-4aac-8b5d-0aa2551ef4a4","Type":"ContainerDied","Data":"ca8b8a972789bd6c351e2e0ecb89b77f6b394a9570e3b0f56058f0e04e655d88"} Mar 14 05:52:18 crc kubenswrapper[4713]: E0314 05:52:18.865947 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-xrnbs" podUID="3f9b887e-a476-4d85-8fc0-695678cee457" Mar 14 05:52:18 crc kubenswrapper[4713]: I0314 05:52:18.994886 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.030964 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.095699 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:52:19 crc kubenswrapper[4713]: E0314 05:52:19.097715 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerName="glance-log" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.097763 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerName="glance-log" Mar 14 05:52:19 crc kubenswrapper[4713]: E0314 05:52:19.097802 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71fb2f45-0f20-479f-a9c5-e7c12bf79988" containerName="keystone-bootstrap" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.097810 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="71fb2f45-0f20-479f-a9c5-e7c12bf79988" containerName="keystone-bootstrap" Mar 14 05:52:19 crc kubenswrapper[4713]: E0314 05:52:19.097845 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerName="glance-httpd" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.097886 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerName="glance-httpd" Mar 14 05:52:19 crc kubenswrapper[4713]: E0314 05:52:19.097917 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06adc2e2-0e41-49dc-8deb-0674f50a77de" containerName="oc" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.097923 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="06adc2e2-0e41-49dc-8deb-0674f50a77de" containerName="oc" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.098775 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="71fb2f45-0f20-479f-a9c5-e7c12bf79988" containerName="keystone-bootstrap" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.098807 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="06adc2e2-0e41-49dc-8deb-0674f50a77de" containerName="oc" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.098829 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerName="glance-log" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.098871 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" containerName="glance-httpd" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.109347 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.111848 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.113841 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.121182 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.263785 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.263871 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.263928 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7d6q\" (UniqueName: \"kubernetes.io/projected/2ae5595c-d4de-4db7-b410-d149afd0f6a1-kube-api-access-n7d6q\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.263959 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.264005 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.264039 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.264093 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.264128 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.340004 4713 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/keystone-bootstrap-dcls9"] Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.350783 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dcls9"] Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.365729 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.365817 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.365927 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.366003 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.366085 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7d6q\" (UniqueName: 
\"kubernetes.io/projected/2ae5595c-d4de-4db7-b410-d149afd0f6a1-kube-api-access-n7d6q\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.366127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.366183 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.366233 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.366288 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.367258 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.370890 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.370867 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.372879 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.375145 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.375185 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/09d437df07f0cc980684bdc1a6436f63ebf1e68a215d57555012df4017c88ddd/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.386411 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.387861 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7d6q\" (UniqueName: \"kubernetes.io/projected/2ae5595c-d4de-4db7-b410-d149afd0f6a1-kube-api-access-n7d6q\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.436316 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.444742 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-44c8n"] Mar 14 05:52:19 crc kubenswrapper[4713]: 
I0314 05:52:19.447261 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.459094 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.459290 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.459456 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.459569 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z87lf" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.459697 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.474806 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-44c8n"] Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.570446 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-config-data\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.570518 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-scripts\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.570551 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-credential-keys\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.570800 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-fernet-keys\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.570919 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-combined-ca-bundle\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.571153 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bkt\" (UniqueName: \"kubernetes.io/projected/e724ed74-dc1e-43d9-84f2-e774c7d969bf-kube-api-access-l5bkt\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.577894 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71fb2f45-0f20-479f-a9c5-e7c12bf79988" path="/var/lib/kubelet/pods/71fb2f45-0f20-479f-a9c5-e7c12bf79988/volumes" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.578678 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7595f4-466a-4aac-8b5d-0aa2551ef4a4" 
path="/var/lib/kubelet/pods/7d7595f4-466a-4aac-8b5d-0aa2551ef4a4/volumes" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.672753 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-fernet-keys\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.672845 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-combined-ca-bundle\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.672941 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5bkt\" (UniqueName: \"kubernetes.io/projected/e724ed74-dc1e-43d9-84f2-e774c7d969bf-kube-api-access-l5bkt\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.672995 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-config-data\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.673043 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-scripts\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc 
kubenswrapper[4713]: I0314 05:52:19.673076 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-credential-keys\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.677161 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-scripts\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.677955 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-combined-ca-bundle\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.678003 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-config-data\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.678370 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-fernet-keys\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.678442 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-credential-keys\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.703049 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bkt\" (UniqueName: \"kubernetes.io/projected/e724ed74-dc1e-43d9-84f2-e774c7d969bf-kube-api-access-l5bkt\") pod \"keystone-bootstrap-44c8n\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.733945 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:19 crc kubenswrapper[4713]: I0314 05:52:19.810580 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:19 crc kubenswrapper[4713]: E0314 05:52:19.995736 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:21 crc kubenswrapper[4713]: I0314 05:52:21.396891 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Mar 14 05:52:21 crc kubenswrapper[4713]: I0314 05:52:21.397729 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:52:26 crc kubenswrapper[4713]: I0314 05:52:26.396791 4713 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Mar 14 05:52:27 crc kubenswrapper[4713]: I0314 05:52:27.267961 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fps7b" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" probeResult="failure" output=< Mar 14 05:52:27 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:52:27 crc kubenswrapper[4713]: > Mar 14 05:52:30 crc kubenswrapper[4713]: E0314 05:52:30.398300 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:30 crc kubenswrapper[4713]: I0314 05:52:30.993059 4713 generic.go:334] "Generic (PLEG): container finished" podID="ea6b1099-f4ac-4540-b964-334be68df63a" containerID="0fc9d4ad5f5c526af06bf8c1b7a8f59df89cb7702b3d4c67df33c2386b50eed1" exitCode=0 Mar 14 05:52:30 crc kubenswrapper[4713]: I0314 05:52:30.993141 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pf6rx" event={"ID":"ea6b1099-f4ac-4540-b964-334be68df63a","Type":"ContainerDied","Data":"0fc9d4ad5f5c526af06bf8c1b7a8f59df89cb7702b3d4c67df33c2386b50eed1"} Mar 14 05:52:31 crc kubenswrapper[4713]: I0314 05:52:31.396670 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Mar 14 05:52:32 crc kubenswrapper[4713]: E0314 05:52:32.606578 4713 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:32 crc kubenswrapper[4713]: E0314 05:52:32.667289 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 14 05:52:32 crc kubenswrapper[4713]: E0314 05:52:32.667460 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55fh6h5d8h659h688h597h584h66bh695h569h5bh5dh5d8h58dhddh85h67h7bh5d5h599h594h669h647h677h54bhd4h5f5h655h5f9h54bhbbhc5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,R
ecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t9gsb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1e410ef9-e81f-4b9f-b12b-46db77edeb7e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:52:32 crc kubenswrapper[4713]: I0314 05:52:32.683632 4713 scope.go:117] "RemoveContainer" containerID="4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb" Mar 14 05:52:33 crc kubenswrapper[4713]: I0314 05:52:33.206540 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 05:52:35 crc kubenswrapper[4713]: E0314 05:52:35.162728 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 14 05:52:35 crc kubenswrapper[4713]: E0314 05:52:35.163287 4713 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dsjz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privile
ged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-r6jzk_openstack(7d3e039f-375f-411e-b265-f6188fc80d58): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:52:35 crc kubenswrapper[4713]: E0314 05:52:35.164517 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-r6jzk" podUID="7d3e039f-375f-411e-b265-f6188fc80d58" Mar 14 05:52:35 crc kubenswrapper[4713]: E0314 05:52:35.609258 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 14 05:52:35 crc kubenswrapper[4713]: E0314 05:52:35.609460 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c6gds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-l62vj_openstack(0cd7eedb-d5e4-4df8-9ff6-717989483135): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:52:35 crc kubenswrapper[4713]: E0314 05:52:35.610974 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-l62vj" 
podUID="0cd7eedb-d5e4-4df8-9ff6-717989483135" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.720397 4713 scope.go:117] "RemoveContainer" containerID="74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74" Mar 14 05:52:35 crc kubenswrapper[4713]: E0314 05:52:35.721509 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74\": container with ID starting with 74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74 not found: ID does not exist" containerID="74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.721545 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74"} err="failed to get container status \"74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74\": rpc error: code = NotFound desc = could not find container \"74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74\": container with ID starting with 74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74 not found: ID does not exist" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.721577 4713 scope.go:117] "RemoveContainer" containerID="4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb" Mar 14 05:52:35 crc kubenswrapper[4713]: E0314 05:52:35.721990 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb\": container with ID starting with 4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb not found: ID does not exist" containerID="4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.722018 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb"} err="failed to get container status \"4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb\": rpc error: code = NotFound desc = could not find container \"4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb\": container with ID starting with 4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb not found: ID does not exist" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.722037 4713 scope.go:117] "RemoveContainer" containerID="74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.728526 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74"} err="failed to get container status \"74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74\": rpc error: code = NotFound desc = could not find container \"74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74\": container with ID starting with 74b7e305f9871e107662ab0b76a0dec7d9b480f2d5cb5213a0a8023c654b1c74 not found: ID does not exist" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.728588 4713 scope.go:117] "RemoveContainer" containerID="4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.729012 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb"} err="failed to get container status \"4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb\": rpc error: code = NotFound desc = could not find container \"4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb\": container with ID starting with 
4ad343c53f580059eaa099a4ae482769db4fe0c2df1d1407877b4a7a112cfaeb not found: ID does not exist" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.729036 4713 scope.go:117] "RemoveContainer" containerID="0deb757193504738dea1bcbab0c00a2cad7d7bcdee1ff823b40c52d856730f6e" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.855457 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.868058 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:52:35 crc kubenswrapper[4713]: I0314 05:52:35.981283 4713 scope.go:117] "RemoveContainer" containerID="f4facb5a6e6a2d16bc6e231d009a9e8b5c64b624c5c4cf3206d44f0926366b1a" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.034467 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glvnq\" (UniqueName: \"kubernetes.io/projected/9fd5a7aa-b08a-47d4-a730-73b10501f049-kube-api-access-glvnq\") pod \"9fd5a7aa-b08a-47d4-a730-73b10501f049\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.034528 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-nb\") pod \"9fd5a7aa-b08a-47d4-a730-73b10501f049\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.034665 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-combined-ca-bundle\") pod \"ea6b1099-f4ac-4540-b964-334be68df63a\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.034687 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-config\") pod \"ea6b1099-f4ac-4540-b964-334be68df63a\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.034720 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-config\") pod \"9fd5a7aa-b08a-47d4-a730-73b10501f049\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.034743 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-dns-svc\") pod \"9fd5a7aa-b08a-47d4-a730-73b10501f049\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.034891 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v66v\" (UniqueName: \"kubernetes.io/projected/ea6b1099-f4ac-4540-b964-334be68df63a-kube-api-access-6v66v\") pod \"ea6b1099-f4ac-4540-b964-334be68df63a\" (UID: \"ea6b1099-f4ac-4540-b964-334be68df63a\") " Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.034923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-sb\") pod \"9fd5a7aa-b08a-47d4-a730-73b10501f049\" (UID: \"9fd5a7aa-b08a-47d4-a730-73b10501f049\") " Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.039514 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd5a7aa-b08a-47d4-a730-73b10501f049-kube-api-access-glvnq" (OuterVolumeSpecName: "kube-api-access-glvnq") pod "9fd5a7aa-b08a-47d4-a730-73b10501f049" (UID: 
"9fd5a7aa-b08a-47d4-a730-73b10501f049"). InnerVolumeSpecName "kube-api-access-glvnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.045756 4713 scope.go:117] "RemoveContainer" containerID="752bbd4a476739bc659635e8469ec6ebc4a584cab2c74ae676f2e96eeee846dc" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.057042 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6b1099-f4ac-4540-b964-334be68df63a-kube-api-access-6v66v" (OuterVolumeSpecName: "kube-api-access-6v66v") pod "ea6b1099-f4ac-4540-b964-334be68df63a" (UID: "ea6b1099-f4ac-4540-b964-334be68df63a"). InnerVolumeSpecName "kube-api-access-6v66v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.059200 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pf6rx" event={"ID":"ea6b1099-f4ac-4540-b964-334be68df63a","Type":"ContainerDied","Data":"97c8fad75e697e3abebd86a4b757a42841e6efabbaebcddeac3710e711cbb9b5"} Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.059269 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c8fad75e697e3abebd86a4b757a42841e6efabbaebcddeac3710e711cbb9b5" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.059263 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pf6rx" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.065573 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" event={"ID":"9fd5a7aa-b08a-47d4-a730-73b10501f049","Type":"ContainerDied","Data":"0e43763ac06eebf6279fe4e92d3271027125b0c081df2a3297a3dbdef6835dc7"} Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.065778 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-24qbb" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.074786 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b733ba7d-6fd3-430d-83ee-3d9f32bad251","Type":"ContainerStarted","Data":"763c34625f57e59928cadd50fc88f976b057c9e843dd776c45f095e3d171f2f3"} Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.101326 4713 scope.go:117] "RemoveContainer" containerID="a5232079bc8b459c680cc65ce29e9e95c4d5a9cffd0f85b22f96c339bf438e63" Mar 14 05:52:36 crc kubenswrapper[4713]: E0314 05:52:36.101322 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-l62vj" podUID="0cd7eedb-d5e4-4df8-9ff6-717989483135" Mar 14 05:52:36 crc kubenswrapper[4713]: E0314 05:52:36.101379 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-r6jzk" podUID="7d3e039f-375f-411e-b265-f6188fc80d58" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.120550 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea6b1099-f4ac-4540-b964-334be68df63a" (UID: "ea6b1099-f4ac-4540-b964-334be68df63a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.126097 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-config" (OuterVolumeSpecName: "config") pod "ea6b1099-f4ac-4540-b964-334be68df63a" (UID: "ea6b1099-f4ac-4540-b964-334be68df63a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.139179 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.139232 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea6b1099-f4ac-4540-b964-334be68df63a-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.139248 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v66v\" (UniqueName: \"kubernetes.io/projected/ea6b1099-f4ac-4540-b964-334be68df63a-kube-api-access-6v66v\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.139262 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glvnq\" (UniqueName: \"kubernetes.io/projected/9fd5a7aa-b08a-47d4-a730-73b10501f049-kube-api-access-glvnq\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.159774 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9fd5a7aa-b08a-47d4-a730-73b10501f049" (UID: "9fd5a7aa-b08a-47d4-a730-73b10501f049"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.173470 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9fd5a7aa-b08a-47d4-a730-73b10501f049" (UID: "9fd5a7aa-b08a-47d4-a730-73b10501f049"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.188503 4713 scope.go:117] "RemoveContainer" containerID="ea46a39ac827800f8c57354eff9702ffd1f91e95da7c6bb89c50b12c73638fed" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.197730 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-config" (OuterVolumeSpecName: "config") pod "9fd5a7aa-b08a-47d4-a730-73b10501f049" (UID: "9fd5a7aa-b08a-47d4-a730-73b10501f049"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.210339 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9fd5a7aa-b08a-47d4-a730-73b10501f049" (UID: "9fd5a7aa-b08a-47d4-a730-73b10501f049"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.241724 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.241757 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.241768 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.241779 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fd5a7aa-b08a-47d4-a730-73b10501f049-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.251510 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-44c8n"] Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.361432 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:52:36 crc kubenswrapper[4713]: W0314 05:52:36.377006 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ae5595c_d4de_4db7_b410_d149afd0f6a1.slice/crio-687d638a944b8d5b372d1278252d0086c05e63ed0da68091ce9e6d4cf45a3c99 WatchSource:0}: Error finding container 687d638a944b8d5b372d1278252d0086c05e63ed0da68091ce9e6d4cf45a3c99: Status 404 returned error can't find the container with id 687d638a944b8d5b372d1278252d0086c05e63ed0da68091ce9e6d4cf45a3c99 Mar 14 05:52:36 crc 
kubenswrapper[4713]: I0314 05:52:36.419803 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-24qbb"] Mar 14 05:52:36 crc kubenswrapper[4713]: I0314 05:52:36.430490 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-24qbb"] Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.154969 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-44c8n" event={"ID":"e724ed74-dc1e-43d9-84f2-e774c7d969bf","Type":"ContainerStarted","Data":"ad7930d216d96bbd0b6b2c8fe04e353dd5a9fd9f77f5b49eadf9bae2f0410bcf"} Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.155479 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-44c8n" event={"ID":"e724ed74-dc1e-43d9-84f2-e774c7d969bf","Type":"ContainerStarted","Data":"5ede0f6f3458b1f4fef5b32d37fad75a1fa932032273646c1da166be50cccfad"} Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.168858 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xrnbs" event={"ID":"3f9b887e-a476-4d85-8fc0-695678cee457","Type":"ContainerStarted","Data":"49496be45acfeac04b48718ccf5031d2ad533c9dc885cfa26d3bd6d13249bafa"} Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.202818 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ae5595c-d4de-4db7-b410-d149afd0f6a1","Type":"ContainerStarted","Data":"687d638a944b8d5b372d1278252d0086c05e63ed0da68091ce9e6d4cf45a3c99"} Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.206023 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2b9cn"] Mar 14 05:52:37 crc kubenswrapper[4713]: E0314 05:52:37.222069 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="init" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.222105 4713 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="init" Mar 14 05:52:37 crc kubenswrapper[4713]: E0314 05:52:37.222132 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="dnsmasq-dns" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.222141 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="dnsmasq-dns" Mar 14 05:52:37 crc kubenswrapper[4713]: E0314 05:52:37.222156 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6b1099-f4ac-4540-b964-334be68df63a" containerName="neutron-db-sync" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.222162 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6b1099-f4ac-4540-b964-334be68df63a" containerName="neutron-db-sync" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.222375 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6b1099-f4ac-4540-b964-334be68df63a" containerName="neutron-db-sync" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.222389 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" containerName="dnsmasq-dns" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.223574 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.243147 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2b9cn"] Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.311329 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fps7b" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" probeResult="failure" output=< Mar 14 05:52:37 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:52:37 crc kubenswrapper[4713]: > Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.312094 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429"} Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.336747 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b733ba7d-6fd3-430d-83ee-3d9f32bad251","Type":"ContainerStarted","Data":"57cee5ab4a51db1af616270bd1f42f4369a82ee3d254e06ea9e8c414b14e8509"} Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.378049 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.390143 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-749pf\" (UniqueName: 
\"kubernetes.io/projected/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-kube-api-access-749pf\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.390315 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.390433 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-config\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.390457 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.411728 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.425500 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-569ddb745b-tqpzz"] Mar 14 
05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.449814 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.456697 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wrtv7" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.456904 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.457081 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.457313 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.491825 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pzsx4" event={"ID":"da9b3f3d-45ae-454f-9430-9b69a22a05b4","Type":"ContainerStarted","Data":"6dbcdc373df2cd984cd4dca3c5a93735843372117e4ed11b91735b53b8e2fe99"} Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.515580 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.516835 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.517100 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-749pf\" (UniqueName: \"kubernetes.io/projected/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-kube-api-access-749pf\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.517224 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.517334 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-config\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.517362 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.530551 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.532189 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.535517 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.537059 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.538238 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-44c8n" podStartSLOduration=18.538224028 podStartE2EDuration="18.538224028s" podCreationTimestamp="2026-03-14 05:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:37.225327412 +0000 UTC m=+1540.313236702" watchObservedRunningTime="2026-03-14 05:52:37.538224028 +0000 UTC m=+1540.626133328" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.550739 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-config\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.553502 
4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-xrnbs" podStartSLOduration=5.822238761 podStartE2EDuration="40.553482874s" podCreationTimestamp="2026-03-14 05:51:57 +0000 UTC" firstStartedPulling="2026-03-14 05:52:01.135706341 +0000 UTC m=+1504.223615641" lastFinishedPulling="2026-03-14 05:52:35.866950454 +0000 UTC m=+1538.954859754" observedRunningTime="2026-03-14 05:52:37.286878512 +0000 UTC m=+1540.374787822" watchObservedRunningTime="2026-03-14 05:52:37.553482874 +0000 UTC m=+1540.641392174" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.568434 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-749pf\" (UniqueName: \"kubernetes.io/projected/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-kube-api-access-749pf\") pod \"dnsmasq-dns-6b7b667979-2b9cn\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.644064 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd5a7aa-b08a-47d4-a730-73b10501f049" path="/var/lib/kubelet/pods/9fd5a7aa-b08a-47d4-a730-73b10501f049/volumes" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.646505 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-httpd-config\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.646755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-combined-ca-bundle\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" 
Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.654280 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-ovndb-tls-certs\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.654457 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-config\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.654513 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgnz\" (UniqueName: \"kubernetes.io/projected/6e045597-8d40-424a-8982-8dbfb1e379e3-kube-api-access-vkgnz\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.662348 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-569ddb745b-tqpzz"] Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.669474 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-pzsx4" podStartSLOduration=5.351143249 podStartE2EDuration="39.669458968s" podCreationTimestamp="2026-03-14 05:51:58 +0000 UTC" firstStartedPulling="2026-03-14 05:52:01.340692001 +0000 UTC m=+1504.428601301" lastFinishedPulling="2026-03-14 05:52:35.65900772 +0000 UTC m=+1538.746917020" observedRunningTime="2026-03-14 05:52:37.552282216 +0000 UTC m=+1540.640191516" watchObservedRunningTime="2026-03-14 05:52:37.669458968 +0000 UTC m=+1540.757368268" Mar 14 05:52:37 crc 
kubenswrapper[4713]: I0314 05:52:37.699733 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.757214 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-ovndb-tls-certs\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.757318 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-config\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.757348 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgnz\" (UniqueName: \"kubernetes.io/projected/6e045597-8d40-424a-8982-8dbfb1e379e3-kube-api-access-vkgnz\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.758019 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-httpd-config\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.758092 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-combined-ca-bundle\") pod \"neutron-569ddb745b-tqpzz\" (UID: 
\"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.763597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-config\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.765136 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-combined-ca-bundle\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.785102 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgnz\" (UniqueName: \"kubernetes.io/projected/6e045597-8d40-424a-8982-8dbfb1e379e3-kube-api-access-vkgnz\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.792061 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-ovndb-tls-certs\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 05:52:37.797452 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-httpd-config\") pod \"neutron-569ddb745b-tqpzz\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:37 crc kubenswrapper[4713]: I0314 
05:52:37.850467 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:38 crc kubenswrapper[4713]: W0314 05:52:38.431521 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf53d9ae_bf57_44b5_b3b4_3f61d3b03a0a.slice/crio-b39e465653c1b9d0444dcf6784e9eb52987bf418af5e58a7ca0dad9644b87285 WatchSource:0}: Error finding container b39e465653c1b9d0444dcf6784e9eb52987bf418af5e58a7ca0dad9644b87285: Status 404 returned error can't find the container with id b39e465653c1b9d0444dcf6784e9eb52987bf418af5e58a7ca0dad9644b87285 Mar 14 05:52:38 crc kubenswrapper[4713]: I0314 05:52:38.443117 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2b9cn"] Mar 14 05:52:38 crc kubenswrapper[4713]: I0314 05:52:38.526811 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e410ef9-e81f-4b9f-b12b-46db77edeb7e","Type":"ContainerStarted","Data":"3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e"} Mar 14 05:52:38 crc kubenswrapper[4713]: I0314 05:52:38.545655 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b733ba7d-6fd3-430d-83ee-3d9f32bad251","Type":"ContainerStarted","Data":"fe9f1c4270f728c656634153b98deef2c3986d1086d3ed248f07da3063174eb1"} Mar 14 05:52:38 crc kubenswrapper[4713]: I0314 05:52:38.548841 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" event={"ID":"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a","Type":"ContainerStarted","Data":"b39e465653c1b9d0444dcf6784e9eb52987bf418af5e58a7ca0dad9644b87285"} Mar 14 05:52:38 crc kubenswrapper[4713]: I0314 05:52:38.560044 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"2ae5595c-d4de-4db7-b410-d149afd0f6a1","Type":"ContainerStarted","Data":"9d1a343a53ec0ab52b00862fd9c26a72290f2ed8da4865f4e2b714d30b83a19e"} Mar 14 05:52:38 crc kubenswrapper[4713]: W0314 05:52:38.788679 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e045597_8d40_424a_8982_8dbfb1e379e3.slice/crio-6e632b3ab7066094fd3339354e722a805d0c43cbb8db9d548dcf497f1eaed56c WatchSource:0}: Error finding container 6e632b3ab7066094fd3339354e722a805d0c43cbb8db9d548dcf497f1eaed56c: Status 404 returned error can't find the container with id 6e632b3ab7066094fd3339354e722a805d0c43cbb8db9d548dcf497f1eaed56c Mar 14 05:52:38 crc kubenswrapper[4713]: I0314 05:52:38.788772 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-569ddb745b-tqpzz"] Mar 14 05:52:39 crc kubenswrapper[4713]: I0314 05:52:39.580430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-569ddb745b-tqpzz" event={"ID":"6e045597-8d40-424a-8982-8dbfb1e379e3","Type":"ContainerStarted","Data":"a748e2389e0bf3125b924d80cd2d7eba269ec4cff23f1d8a7715009735a60964"} Mar 14 05:52:39 crc kubenswrapper[4713]: I0314 05:52:39.580682 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-569ddb745b-tqpzz" event={"ID":"6e045597-8d40-424a-8982-8dbfb1e379e3","Type":"ContainerStarted","Data":"6e632b3ab7066094fd3339354e722a805d0c43cbb8db9d548dcf497f1eaed56c"} Mar 14 05:52:39 crc kubenswrapper[4713]: I0314 05:52:39.605786 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=33.605771315 podStartE2EDuration="33.605771315s" podCreationTimestamp="2026-03-14 05:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:39.604470454 +0000 UTC m=+1542.692379754" watchObservedRunningTime="2026-03-14 
05:52:39.605771315 +0000 UTC m=+1542.693680615" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.197665 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c66c75585-lmbx8"] Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.202599 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.208853 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.209045 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.240850 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c66c75585-lmbx8"] Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.304330 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-ovndb-tls-certs\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.304398 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxtn\" (UniqueName: \"kubernetes.io/projected/00ca0cb1-0837-4538-ad90-a6425a10e037-kube-api-access-mhxtn\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.304520 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-config\") pod \"neutron-6c66c75585-lmbx8\" (UID: 
\"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.304636 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-httpd-config\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.304755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-public-tls-certs\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.304836 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-internal-tls-certs\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.304899 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-combined-ca-bundle\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.407614 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-public-tls-certs\") pod \"neutron-6c66c75585-lmbx8\" (UID: 
\"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.408029 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-internal-tls-certs\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.408098 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-combined-ca-bundle\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.408237 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-ovndb-tls-certs\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.408278 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxtn\" (UniqueName: \"kubernetes.io/projected/00ca0cb1-0837-4538-ad90-a6425a10e037-kube-api-access-mhxtn\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.408387 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-config\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 
14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.408498 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-httpd-config\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.415811 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-combined-ca-bundle\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.415853 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-internal-tls-certs\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.416164 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-httpd-config\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.416662 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-config\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.419920 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-public-tls-certs\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.420654 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-ovndb-tls-certs\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.435010 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxtn\" (UniqueName: \"kubernetes.io/projected/00ca0cb1-0837-4538-ad90-a6425a10e037-kube-api-access-mhxtn\") pod \"neutron-6c66c75585-lmbx8\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.546989 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.606724 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-569ddb745b-tqpzz" event={"ID":"6e045597-8d40-424a-8982-8dbfb1e379e3","Type":"ContainerStarted","Data":"f5c915a9b21e748c2c71a906bcd44f40862820332d35fd89b95c969705d5e576"} Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.607391 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.610527 4713 generic.go:334] "Generic (PLEG): container finished" podID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" containerID="d748fcddbd0e761755e65fcd606bdbe1ea6b82d32b30800d29233576b80e55ca" exitCode=0 Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.610606 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" event={"ID":"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a","Type":"ContainerDied","Data":"d748fcddbd0e761755e65fcd606bdbe1ea6b82d32b30800d29233576b80e55ca"} Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.619944 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ae5595c-d4de-4db7-b410-d149afd0f6a1","Type":"ContainerStarted","Data":"c9698ecce9eddc0bcd717f7b57cf34fdd3e3de66254ab15ddef07410622a4221"} Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.629984 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-569ddb745b-tqpzz" podStartSLOduration=3.629945728 podStartE2EDuration="3.629945728s" podCreationTimestamp="2026-03-14 05:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:40.629252226 +0000 UTC m=+1543.717161526" watchObservedRunningTime="2026-03-14 05:52:40.629945728 +0000 UTC m=+1543.717855038" 
Mar 14 05:52:40 crc kubenswrapper[4713]: I0314 05:52:40.678809 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=22.678789784 podStartE2EDuration="22.678789784s" podCreationTimestamp="2026-03-14 05:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:40.668586339 +0000 UTC m=+1543.756495639" watchObservedRunningTime="2026-03-14 05:52:40.678789784 +0000 UTC m=+1543.766699084" Mar 14 05:52:40 crc kubenswrapper[4713]: E0314 05:52:40.930157 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:41 crc kubenswrapper[4713]: I0314 05:52:41.451934 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c66c75585-lmbx8"] Mar 14 05:52:41 crc kubenswrapper[4713]: I0314 05:52:41.633138 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/0.log" Mar 14 05:52:41 crc kubenswrapper[4713]: I0314 05:52:41.635234 4713 generic.go:334] "Generic (PLEG): container finished" podID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerID="f5c915a9b21e748c2c71a906bcd44f40862820332d35fd89b95c969705d5e576" exitCode=1 Mar 14 05:52:41 crc kubenswrapper[4713]: I0314 05:52:41.635278 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-569ddb745b-tqpzz" event={"ID":"6e045597-8d40-424a-8982-8dbfb1e379e3","Type":"ContainerDied","Data":"f5c915a9b21e748c2c71a906bcd44f40862820332d35fd89b95c969705d5e576"} Mar 14 05:52:41 crc kubenswrapper[4713]: I0314 05:52:41.636089 4713 
scope.go:117] "RemoveContainer" containerID="f5c915a9b21e748c2c71a906bcd44f40862820332d35fd89b95c969705d5e576" Mar 14 05:52:41 crc kubenswrapper[4713]: I0314 05:52:41.642695 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c66c75585-lmbx8" event={"ID":"00ca0cb1-0837-4538-ad90-a6425a10e037","Type":"ContainerStarted","Data":"b848052252f303b975bacdff3e566608159b40824a0864575145493205b03151"} Mar 14 05:52:41 crc kubenswrapper[4713]: I0314 05:52:41.653417 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" event={"ID":"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a","Type":"ContainerStarted","Data":"e8df5c89ced32a4a9160e058153b13e74607fbd75974957690a8713fc8f32f04"} Mar 14 05:52:41 crc kubenswrapper[4713]: I0314 05:52:41.653597 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:41 crc kubenswrapper[4713]: I0314 05:52:41.719725 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" podStartSLOduration=4.719701719 podStartE2EDuration="4.719701719s" podCreationTimestamp="2026-03-14 05:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:41.718107179 +0000 UTC m=+1544.806016479" watchObservedRunningTime="2026-03-14 05:52:41.719701719 +0000 UTC m=+1544.807611019" Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.738759 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/1.log" Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.755228 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/0.log" Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.757792 4713 
generic.go:334] "Generic (PLEG): container finished" podID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerID="026c9cd5265e0ecd6746be03e428614bf5b60f9e0c9fafda55bfb6031d507c8d" exitCode=1 Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.759745 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-569ddb745b-tqpzz" event={"ID":"6e045597-8d40-424a-8982-8dbfb1e379e3","Type":"ContainerDied","Data":"026c9cd5265e0ecd6746be03e428614bf5b60f9e0c9fafda55bfb6031d507c8d"} Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.759816 4713 scope.go:117] "RemoveContainer" containerID="f5c915a9b21e748c2c71a906bcd44f40862820332d35fd89b95c969705d5e576" Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.762615 4713 scope.go:117] "RemoveContainer" containerID="026c9cd5265e0ecd6746be03e428614bf5b60f9e0c9fafda55bfb6031d507c8d" Mar 14 05:52:42 crc kubenswrapper[4713]: E0314 05:52:42.765900 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-569ddb745b-tqpzz_openstack(6e045597-8d40-424a-8982-8dbfb1e379e3)\"" pod="openstack/neutron-569ddb745b-tqpzz" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.817244 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c66c75585-lmbx8" event={"ID":"00ca0cb1-0837-4538-ad90-a6425a10e037","Type":"ContainerStarted","Data":"23cd3874d58987b3193d2239efbf9fb030e08b5be4f155204c4c7fc9692edcbd"} Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.817574 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c66c75585-lmbx8" event={"ID":"00ca0cb1-0837-4538-ad90-a6425a10e037","Type":"ContainerStarted","Data":"c1adca9c77949e566b093cb78b119a75d1add017dcbc51a6d10bd6e0f0845a63"} Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.819364 4713 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.853339 4713 generic.go:334] "Generic (PLEG): container finished" podID="da9b3f3d-45ae-454f-9430-9b69a22a05b4" containerID="6dbcdc373df2cd984cd4dca3c5a93735843372117e4ed11b91735b53b8e2fe99" exitCode=0 Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.854390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pzsx4" event={"ID":"da9b3f3d-45ae-454f-9430-9b69a22a05b4","Type":"ContainerDied","Data":"6dbcdc373df2cd984cd4dca3c5a93735843372117e4ed11b91735b53b8e2fe99"} Mar 14 05:52:42 crc kubenswrapper[4713]: I0314 05:52:42.866622 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c66c75585-lmbx8" podStartSLOduration=2.866601122 podStartE2EDuration="2.866601122s" podCreationTimestamp="2026-03-14 05:52:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:42.841348187 +0000 UTC m=+1545.929257487" watchObservedRunningTime="2026-03-14 05:52:42.866601122 +0000 UTC m=+1545.954510422" Mar 14 05:52:43 crc kubenswrapper[4713]: I0314 05:52:43.865965 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/1.log" Mar 14 05:52:43 crc kubenswrapper[4713]: I0314 05:52:43.867530 4713 scope.go:117] "RemoveContainer" containerID="026c9cd5265e0ecd6746be03e428614bf5b60f9e0c9fafda55bfb6031d507c8d" Mar 14 05:52:43 crc kubenswrapper[4713]: E0314 05:52:43.867750 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-569ddb745b-tqpzz_openstack(6e045597-8d40-424a-8982-8dbfb1e379e3)\"" pod="openstack/neutron-569ddb745b-tqpzz" 
podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" Mar 14 05:52:44 crc kubenswrapper[4713]: I0314 05:52:44.901276 4713 generic.go:334] "Generic (PLEG): container finished" podID="e724ed74-dc1e-43d9-84f2-e774c7d969bf" containerID="ad7930d216d96bbd0b6b2c8fe04e353dd5a9fd9f77f5b49eadf9bae2f0410bcf" exitCode=0 Mar 14 05:52:44 crc kubenswrapper[4713]: I0314 05:52:44.903472 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-44c8n" event={"ID":"e724ed74-dc1e-43d9-84f2-e774c7d969bf","Type":"ContainerDied","Data":"ad7930d216d96bbd0b6b2c8fe04e353dd5a9fd9f77f5b49eadf9bae2f0410bcf"} Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.210009 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pzsx4" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.254489 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxd6s\" (UniqueName: \"kubernetes.io/projected/da9b3f3d-45ae-454f-9430-9b69a22a05b4-kube-api-access-xxd6s\") pod \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.254651 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-combined-ca-bundle\") pod \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.254727 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-config-data\") pod \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.254785 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-scripts\") pod \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.254841 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9b3f3d-45ae-454f-9430-9b69a22a05b4-logs\") pod \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\" (UID: \"da9b3f3d-45ae-454f-9430-9b69a22a05b4\") " Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.255345 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da9b3f3d-45ae-454f-9430-9b69a22a05b4-logs" (OuterVolumeSpecName: "logs") pod "da9b3f3d-45ae-454f-9430-9b69a22a05b4" (UID: "da9b3f3d-45ae-454f-9430-9b69a22a05b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.255823 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9b3f3d-45ae-454f-9430-9b69a22a05b4-logs\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.262806 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-scripts" (OuterVolumeSpecName: "scripts") pod "da9b3f3d-45ae-454f-9430-9b69a22a05b4" (UID: "da9b3f3d-45ae-454f-9430-9b69a22a05b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.265851 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9b3f3d-45ae-454f-9430-9b69a22a05b4-kube-api-access-xxd6s" (OuterVolumeSpecName: "kube-api-access-xxd6s") pod "da9b3f3d-45ae-454f-9430-9b69a22a05b4" (UID: "da9b3f3d-45ae-454f-9430-9b69a22a05b4"). 
InnerVolumeSpecName "kube-api-access-xxd6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.291646 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da9b3f3d-45ae-454f-9430-9b69a22a05b4" (UID: "da9b3f3d-45ae-454f-9430-9b69a22a05b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.292440 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-config-data" (OuterVolumeSpecName: "config-data") pod "da9b3f3d-45ae-454f-9430-9b69a22a05b4" (UID: "da9b3f3d-45ae-454f-9430-9b69a22a05b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.358517 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.358557 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.358571 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxd6s\" (UniqueName: \"kubernetes.io/projected/da9b3f3d-45ae-454f-9430-9b69a22a05b4-kube-api-access-xxd6s\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.358585 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/da9b3f3d-45ae-454f-9430-9b69a22a05b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.919982 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pzsx4" event={"ID":"da9b3f3d-45ae-454f-9430-9b69a22a05b4","Type":"ContainerDied","Data":"ddd18964f814273be38c59844586207e5d6c81fe515ee46e36de2366e99e2504"} Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.920283 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd18964f814273be38c59844586207e5d6c81fe515ee46e36de2366e99e2504" Mar 14 05:52:45 crc kubenswrapper[4713]: I0314 05:52:45.920035 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pzsx4" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.317127 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b9cc97768-hg4ff"] Mar 14 05:52:46 crc kubenswrapper[4713]: E0314 05:52:46.317903 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9b3f3d-45ae-454f-9430-9b69a22a05b4" containerName="placement-db-sync" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.317927 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9b3f3d-45ae-454f-9430-9b69a22a05b4" containerName="placement-db-sync" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.318239 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9b3f3d-45ae-454f-9430-9b69a22a05b4" containerName="placement-db-sync" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.319443 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.324943 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.325184 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.325333 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.325436 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6mp8h" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.325533 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.351481 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b9cc97768-hg4ff"] Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.377381 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-internal-tls-certs\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.377435 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/805c8988-dae7-41ae-8160-75ad28990e12-logs\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.377472 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-config-data\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.377522 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-public-tls-certs\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.377590 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-scripts\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.377856 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rkz\" (UniqueName: \"kubernetes.io/projected/805c8988-dae7-41ae-8160-75ad28990e12-kube-api-access-94rkz\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.377952 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-combined-ca-bundle\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.480228 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-config-data\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.480296 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-public-tls-certs\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.480328 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-scripts\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.480390 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rkz\" (UniqueName: \"kubernetes.io/projected/805c8988-dae7-41ae-8160-75ad28990e12-kube-api-access-94rkz\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.481397 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-combined-ca-bundle\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.481536 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-internal-tls-certs\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.481625 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/805c8988-dae7-41ae-8160-75ad28990e12-logs\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.482052 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/805c8988-dae7-41ae-8160-75ad28990e12-logs\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.491439 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-public-tls-certs\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.491820 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-scripts\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.491845 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-config-data\") pod \"placement-b9cc97768-hg4ff\" (UID: 
\"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.492028 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-internal-tls-certs\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.492480 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-combined-ca-bundle\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.499760 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rkz\" (UniqueName: \"kubernetes.io/projected/805c8988-dae7-41ae-8160-75ad28990e12-kube-api-access-94rkz\") pod \"placement-b9cc97768-hg4ff\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.655721 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.935079 4713 generic.go:334] "Generic (PLEG): container finished" podID="3f9b887e-a476-4d85-8fc0-695678cee457" containerID="49496be45acfeac04b48718ccf5031d2ad533c9dc885cfa26d3bd6d13249bafa" exitCode=0 Mar 14 05:52:46 crc kubenswrapper[4713]: I0314 05:52:46.935133 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xrnbs" event={"ID":"3f9b887e-a476-4d85-8fc0-695678cee457","Type":"ContainerDied","Data":"49496be45acfeac04b48718ccf5031d2ad533c9dc885cfa26d3bd6d13249bafa"} Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.280093 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fps7b" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" probeResult="failure" output=< Mar 14 05:52:47 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:52:47 crc kubenswrapper[4713]: > Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.595525 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.605255 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-combined-ca-bundle\") pod \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.605361 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-config-data\") pod \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.605585 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-credential-keys\") pod \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.605632 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5bkt\" (UniqueName: \"kubernetes.io/projected/e724ed74-dc1e-43d9-84f2-e774c7d969bf-kube-api-access-l5bkt\") pod \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.606967 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-scripts\") pod \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.607081 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-fernet-keys\") pod \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\" (UID: \"e724ed74-dc1e-43d9-84f2-e774c7d969bf\") " Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.617347 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e724ed74-dc1e-43d9-84f2-e774c7d969bf" (UID: "e724ed74-dc1e-43d9-84f2-e774c7d969bf"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.617453 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-scripts" (OuterVolumeSpecName: "scripts") pod "e724ed74-dc1e-43d9-84f2-e774c7d969bf" (UID: "e724ed74-dc1e-43d9-84f2-e774c7d969bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.617843 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e724ed74-dc1e-43d9-84f2-e774c7d969bf" (UID: "e724ed74-dc1e-43d9-84f2-e774c7d969bf"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.623553 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e724ed74-dc1e-43d9-84f2-e774c7d969bf-kube-api-access-l5bkt" (OuterVolumeSpecName: "kube-api-access-l5bkt") pod "e724ed74-dc1e-43d9-84f2-e774c7d969bf" (UID: "e724ed74-dc1e-43d9-84f2-e774c7d969bf"). InnerVolumeSpecName "kube-api-access-l5bkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.672548 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.675178 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.693699 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e724ed74-dc1e-43d9-84f2-e774c7d969bf" (UID: "e724ed74-dc1e-43d9-84f2-e774c7d969bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.695349 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-config-data" (OuterVolumeSpecName: "config-data") pod "e724ed74-dc1e-43d9-84f2-e774c7d969bf" (UID: "e724ed74-dc1e-43d9-84f2-e774c7d969bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.704325 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.719253 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.719505 4713 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.719616 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5bkt\" (UniqueName: \"kubernetes.io/projected/e724ed74-dc1e-43d9-84f2-e774c7d969bf-kube-api-access-l5bkt\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.719710 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.719806 4713 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.719901 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e724ed74-dc1e-43d9-84f2-e774c7d969bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.741395 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.784544 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7ksbx"] Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.788151 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" podUID="05a7f9cd-9580-4525-b249-7ff75958b351" containerName="dnsmasq-dns" containerID="cri-o://d185a15f99357520c8ff20db4acbb3592644177ab32be7d2a0b8ff06f5437eef" gracePeriod=10 Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.808261 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.960059 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b9cc97768-hg4ff"] Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.972326 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e410ef9-e81f-4b9f-b12b-46db77edeb7e","Type":"ContainerStarted","Data":"5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39"} Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.980441 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b9cc97768-hg4ff" event={"ID":"805c8988-dae7-41ae-8160-75ad28990e12","Type":"ContainerStarted","Data":"0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281"} Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.986922 4713 generic.go:334] "Generic (PLEG): container finished" podID="05a7f9cd-9580-4525-b249-7ff75958b351" containerID="d185a15f99357520c8ff20db4acbb3592644177ab32be7d2a0b8ff06f5437eef" exitCode=0 Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.987003 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" 
event={"ID":"05a7f9cd-9580-4525-b249-7ff75958b351","Type":"ContainerDied","Data":"d185a15f99357520c8ff20db4acbb3592644177ab32be7d2a0b8ff06f5437eef"} Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.991016 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-44c8n" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.991418 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-44c8n" event={"ID":"e724ed74-dc1e-43d9-84f2-e774c7d969bf","Type":"ContainerDied","Data":"5ede0f6f3458b1f4fef5b32d37fad75a1fa932032273646c1da166be50cccfad"} Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.991769 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ede0f6f3458b1f4fef5b32d37fad75a1fa932032273646c1da166be50cccfad" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.991792 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 05:52:47 crc kubenswrapper[4713]: I0314 05:52:47.991918 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 05:52:48 crc kubenswrapper[4713]: E0314 05:52:48.141092 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05a7f9cd_9580_4525_b249_7ff75958b351.slice/crio-conmon-d185a15f99357520c8ff20db4acbb3592644177ab32be7d2a0b8ff06f5437eef.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05a7f9cd_9580_4525_b249_7ff75958b351.slice/crio-d185a15f99357520c8ff20db4acbb3592644177ab32be7d2a0b8ff06f5437eef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode724ed74_dc1e_43d9_84f2_e774c7d969bf.slice/crio-5ede0f6f3458b1f4fef5b32d37fad75a1fa932032273646c1da166be50cccfad\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:48 crc kubenswrapper[4713]: E0314 05:52:48.148472 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:48 crc kubenswrapper[4713]: E0314 05:52:48.159937 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.330292 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.439086 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-swift-storage-0\") pod \"05a7f9cd-9580-4525-b249-7ff75958b351\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.439165 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-svc\") pod \"05a7f9cd-9580-4525-b249-7ff75958b351\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.439262 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-sb\") pod \"05a7f9cd-9580-4525-b249-7ff75958b351\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.439300 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvtgp\" (UniqueName: \"kubernetes.io/projected/05a7f9cd-9580-4525-b249-7ff75958b351-kube-api-access-nvtgp\") pod \"05a7f9cd-9580-4525-b249-7ff75958b351\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.439341 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-nb\") pod \"05a7f9cd-9580-4525-b249-7ff75958b351\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.439400 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-config\") pod \"05a7f9cd-9580-4525-b249-7ff75958b351\" (UID: \"05a7f9cd-9580-4525-b249-7ff75958b351\") " Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.447707 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a7f9cd-9580-4525-b249-7ff75958b351-kube-api-access-nvtgp" (OuterVolumeSpecName: "kube-api-access-nvtgp") pod "05a7f9cd-9580-4525-b249-7ff75958b351" (UID: "05a7f9cd-9580-4525-b249-7ff75958b351"). InnerVolumeSpecName "kube-api-access-nvtgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.543230 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvtgp\" (UniqueName: \"kubernetes.io/projected/05a7f9cd-9580-4525-b249-7ff75958b351-kube-api-access-nvtgp\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.592248 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "05a7f9cd-9580-4525-b249-7ff75958b351" (UID: "05a7f9cd-9580-4525-b249-7ff75958b351"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.604896 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-config" (OuterVolumeSpecName: "config") pod "05a7f9cd-9580-4525-b249-7ff75958b351" (UID: "05a7f9cd-9580-4525-b249-7ff75958b351"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.621290 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05a7f9cd-9580-4525-b249-7ff75958b351" (UID: "05a7f9cd-9580-4525-b249-7ff75958b351"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.647684 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.647722 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.647736 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.648139 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "05a7f9cd-9580-4525-b249-7ff75958b351" (UID: "05a7f9cd-9580-4525-b249-7ff75958b351"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.669856 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "05a7f9cd-9580-4525-b249-7ff75958b351" (UID: "05a7f9cd-9580-4525-b249-7ff75958b351"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.750662 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.750699 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/05a7f9cd-9580-4525-b249-7ff75958b351-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.761001 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-xrnbs" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.785214 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-69d88696fb-hfdtr"] Mar 14 05:52:48 crc kubenswrapper[4713]: E0314 05:52:48.798200 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9b887e-a476-4d85-8fc0-695678cee457" containerName="heat-db-sync" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.798443 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9b887e-a476-4d85-8fc0-695678cee457" containerName="heat-db-sync" Mar 14 05:52:48 crc kubenswrapper[4713]: E0314 05:52:48.798536 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a7f9cd-9580-4525-b249-7ff75958b351" containerName="dnsmasq-dns" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.798592 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a7f9cd-9580-4525-b249-7ff75958b351" containerName="dnsmasq-dns" Mar 14 05:52:48 crc kubenswrapper[4713]: E0314 05:52:48.798658 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e724ed74-dc1e-43d9-84f2-e774c7d969bf" containerName="keystone-bootstrap" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.798736 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e724ed74-dc1e-43d9-84f2-e774c7d969bf" containerName="keystone-bootstrap" Mar 14 05:52:48 crc kubenswrapper[4713]: E0314 05:52:48.798852 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a7f9cd-9580-4525-b249-7ff75958b351" containerName="init" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.798909 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a7f9cd-9580-4525-b249-7ff75958b351" containerName="init" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.799350 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e724ed74-dc1e-43d9-84f2-e774c7d969bf" containerName="keystone-bootstrap" Mar 14 05:52:48 
crc kubenswrapper[4713]: I0314 05:52:48.799432 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9b887e-a476-4d85-8fc0-695678cee457" containerName="heat-db-sync" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.799503 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a7f9cd-9580-4525-b249-7ff75958b351" containerName="dnsmasq-dns" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.800362 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.801677 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69d88696fb-hfdtr"] Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.806653 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.806945 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z87lf" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.807080 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.807223 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.807530 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.807904 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.868330 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-combined-ca-bundle\") pod 
\"3f9b887e-a476-4d85-8fc0-695678cee457\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.868375 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm8bs\" (UniqueName: \"kubernetes.io/projected/3f9b887e-a476-4d85-8fc0-695678cee457-kube-api-access-xm8bs\") pod \"3f9b887e-a476-4d85-8fc0-695678cee457\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.868602 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-config-data\") pod \"3f9b887e-a476-4d85-8fc0-695678cee457\" (UID: \"3f9b887e-a476-4d85-8fc0-695678cee457\") " Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.893612 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9b887e-a476-4d85-8fc0-695678cee457-kube-api-access-xm8bs" (OuterVolumeSpecName: "kube-api-access-xm8bs") pod "3f9b887e-a476-4d85-8fc0-695678cee457" (UID: "3f9b887e-a476-4d85-8fc0-695678cee457"). InnerVolumeSpecName "kube-api-access-xm8bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.925479 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f9b887e-a476-4d85-8fc0-695678cee457" (UID: "3f9b887e-a476-4d85-8fc0-695678cee457"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970437 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-credential-keys\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970490 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-public-tls-certs\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970535 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzwj6\" (UniqueName: \"kubernetes.io/projected/91752d56-0175-41ab-8cea-a8b7ab4c55cf-kube-api-access-dzwj6\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970611 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-combined-ca-bundle\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970636 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-scripts\") pod \"keystone-69d88696fb-hfdtr\" (UID: 
\"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970654 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-config-data\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970685 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-internal-tls-certs\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970709 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-fernet-keys\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970766 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:48 crc kubenswrapper[4713]: I0314 05:52:48.970777 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm8bs\" (UniqueName: \"kubernetes.io/projected/3f9b887e-a476-4d85-8fc0-695678cee457-kube-api-access-xm8bs\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.012523 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b9cc97768-hg4ff" 
event={"ID":"805c8988-dae7-41ae-8160-75ad28990e12","Type":"ContainerStarted","Data":"52e10daf48f474de7b04b71da57a7ef18a8d5d49e5c13edbdd7d11398dad1f6e"} Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.032274 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" event={"ID":"05a7f9cd-9580-4525-b249-7ff75958b351","Type":"ContainerDied","Data":"f1e04b8c94e9d3a5a8072660f77ffae24d8fb09a8d30f4674ea6bfa9895db0d2"} Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.032332 4713 scope.go:117] "RemoveContainer" containerID="d185a15f99357520c8ff20db4acbb3592644177ab32be7d2a0b8ff06f5437eef" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.032486 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-7ksbx" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.051721 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xrnbs" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.052121 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xrnbs" event={"ID":"3f9b887e-a476-4d85-8fc0-695678cee457","Type":"ContainerDied","Data":"df66ce89eaafd0f2a99b8f57a9394fb8055b36955bab4e4d50e778e56b774c74"} Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.052147 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df66ce89eaafd0f2a99b8f57a9394fb8055b36955bab4e4d50e778e56b774c74" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.076217 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-combined-ca-bundle\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.076267 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-scripts\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.076308 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-config-data\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.076342 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-internal-tls-certs\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.076367 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-fernet-keys\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.076425 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-credential-keys\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.076447 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-public-tls-certs\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.076489 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzwj6\" (UniqueName: \"kubernetes.io/projected/91752d56-0175-41ab-8cea-a8b7ab4c55cf-kube-api-access-dzwj6\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.081169 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-fernet-keys\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.081607 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-scripts\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.082781 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-config-data\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.083500 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-public-tls-certs\") pod \"keystone-69d88696fb-hfdtr\" (UID: 
\"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.085401 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-credential-keys\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.085861 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-internal-tls-certs\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.091850 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-config-data" (OuterVolumeSpecName: "config-data") pod "3f9b887e-a476-4d85-8fc0-695678cee457" (UID: "3f9b887e-a476-4d85-8fc0-695678cee457"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.103775 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91752d56-0175-41ab-8cea-a8b7ab4c55cf-combined-ca-bundle\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.105000 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzwj6\" (UniqueName: \"kubernetes.io/projected/91752d56-0175-41ab-8cea-a8b7ab4c55cf-kube-api-access-dzwj6\") pod \"keystone-69d88696fb-hfdtr\" (UID: \"91752d56-0175-41ab-8cea-a8b7ab4c55cf\") " pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.127465 4713 scope.go:117] "RemoveContainer" containerID="8402ea8a26db465723983dd05e69897060bc568f1ff0340827362cb3c06d0a91" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.127539 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7ksbx"] Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.132499 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.147075 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7ksbx"] Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.181963 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9b887e-a476-4d85-8fc0-695678cee457-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.599581 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a7f9cd-9580-4525-b249-7ff75958b351" path="/var/lib/kubelet/pods/05a7f9cd-9580-4525-b249-7ff75958b351/volumes" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.737314 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.752311 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.755421 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.755523 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.850168 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.851254 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69d88696fb-hfdtr"] Mar 14 05:52:49 crc kubenswrapper[4713]: I0314 05:52:49.867037 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 
05:52:50 crc kubenswrapper[4713]: I0314 05:52:50.074636 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b9cc97768-hg4ff" event={"ID":"805c8988-dae7-41ae-8160-75ad28990e12","Type":"ContainerStarted","Data":"ae72f01107aa077345da42e895c2ca165408431f7b73013f1bc47b2dbd23a6b9"} Mar 14 05:52:50 crc kubenswrapper[4713]: I0314 05:52:50.075701 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:50 crc kubenswrapper[4713]: I0314 05:52:50.075739 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:52:50 crc kubenswrapper[4713]: I0314 05:52:50.101629 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-r6jzk" event={"ID":"7d3e039f-375f-411e-b265-f6188fc80d58","Type":"ContainerStarted","Data":"a314a81594457aabe7d7b756ec1dc15d8bb66ba7e671bd1329baed963bb45a34"} Mar 14 05:52:50 crc kubenswrapper[4713]: I0314 05:52:50.104523 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69d88696fb-hfdtr" event={"ID":"91752d56-0175-41ab-8cea-a8b7ab4c55cf","Type":"ContainerStarted","Data":"d055dbe2eea12390c586ad08f0bd5e6cf8e345e9a2b90a46335ab0ea33a7ac2d"} Mar 14 05:52:50 crc kubenswrapper[4713]: I0314 05:52:50.108025 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b9cc97768-hg4ff" podStartSLOduration=4.108011679 podStartE2EDuration="4.108011679s" podCreationTimestamp="2026-03-14 05:52:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:50.10177095 +0000 UTC m=+1553.189680250" watchObservedRunningTime="2026-03-14 05:52:50.108011679 +0000 UTC m=+1553.195920979" Mar 14 05:52:50 crc kubenswrapper[4713]: I0314 05:52:50.142003 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-db-sync-r6jzk" podStartSLOduration=5.6984489069999995 podStartE2EDuration="53.141975301s" podCreationTimestamp="2026-03-14 05:51:57 +0000 UTC" firstStartedPulling="2026-03-14 05:52:00.750452179 +0000 UTC m=+1503.838361479" lastFinishedPulling="2026-03-14 05:52:48.193978573 +0000 UTC m=+1551.281887873" observedRunningTime="2026-03-14 05:52:50.117669258 +0000 UTC m=+1553.205578558" watchObservedRunningTime="2026-03-14 05:52:50.141975301 +0000 UTC m=+1553.229884601" Mar 14 05:52:50 crc kubenswrapper[4713]: E0314 05:52:50.988568 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25cc920_5615_46b7_bca6_c0614071eddd.slice/crio-7dd18c51b672d696cd92622d04ca50318ee14685dfb09a1c90d24305ff959bd4.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:52:51 crc kubenswrapper[4713]: I0314 05:52:51.156187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69d88696fb-hfdtr" event={"ID":"91752d56-0175-41ab-8cea-a8b7ab4c55cf","Type":"ContainerStarted","Data":"0752d56201d451587d407e3270d06fcc80628d144fa2aa75cd9cc85d0fd2bca5"} Mar 14 05:52:51 crc kubenswrapper[4713]: I0314 05:52:51.157429 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-69d88696fb-hfdtr" Mar 14 05:52:51 crc kubenswrapper[4713]: I0314 05:52:51.186389 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-69d88696fb-hfdtr" podStartSLOduration=3.186369578 podStartE2EDuration="3.186369578s" podCreationTimestamp="2026-03-14 05:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:52:51.177672861 +0000 UTC m=+1554.265582161" watchObservedRunningTime="2026-03-14 05:52:51.186369578 +0000 UTC m=+1554.274278868" Mar 14 05:52:52 crc kubenswrapper[4713]: 
I0314 05:52:52.176144 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l62vj" event={"ID":"0cd7eedb-d5e4-4df8-9ff6-717989483135","Type":"ContainerStarted","Data":"e1a96dcc5fb019e1db1c1eeabb3ecd04a2f32dd7153079efd4a1d3ada0c6bdd7"} Mar 14 05:52:52 crc kubenswrapper[4713]: I0314 05:52:52.201001 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-l62vj" podStartSLOduration=4.991072066 podStartE2EDuration="55.200983756s" podCreationTimestamp="2026-03-14 05:51:57 +0000 UTC" firstStartedPulling="2026-03-14 05:52:01.06818844 +0000 UTC m=+1504.156097740" lastFinishedPulling="2026-03-14 05:52:51.27810013 +0000 UTC m=+1554.366009430" observedRunningTime="2026-03-14 05:52:52.190719839 +0000 UTC m=+1555.278629159" watchObservedRunningTime="2026-03-14 05:52:52.200983756 +0000 UTC m=+1555.288893056" Mar 14 05:52:54 crc kubenswrapper[4713]: I0314 05:52:54.215939 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:54 crc kubenswrapper[4713]: I0314 05:52:54.216431 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 05:52:54 crc kubenswrapper[4713]: I0314 05:52:54.219807 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 05:52:54 crc kubenswrapper[4713]: I0314 05:52:54.225966 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 05:52:54 crc kubenswrapper[4713]: I0314 05:52:54.226190 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 05:52:54 crc kubenswrapper[4713]: I0314 05:52:54.226474 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 05:52:57 crc kubenswrapper[4713]: I0314 05:52:57.335485 4713 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fps7b" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" probeResult="failure" output=< Mar 14 05:52:57 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:52:57 crc kubenswrapper[4713]: > Mar 14 05:52:57 crc kubenswrapper[4713]: I0314 05:52:57.575000 4713 scope.go:117] "RemoveContainer" containerID="026c9cd5265e0ecd6746be03e428614bf5b60f9e0c9fafda55bfb6031d507c8d" Mar 14 05:52:58 crc kubenswrapper[4713]: I0314 05:52:58.270974 4713 generic.go:334] "Generic (PLEG): container finished" podID="0cd7eedb-d5e4-4df8-9ff6-717989483135" containerID="e1a96dcc5fb019e1db1c1eeabb3ecd04a2f32dd7153079efd4a1d3ada0c6bdd7" exitCode=0 Mar 14 05:52:58 crc kubenswrapper[4713]: I0314 05:52:58.271156 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l62vj" event={"ID":"0cd7eedb-d5e4-4df8-9ff6-717989483135","Type":"ContainerDied","Data":"e1a96dcc5fb019e1db1c1eeabb3ecd04a2f32dd7153079efd4a1d3ada0c6bdd7"} Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.286549 4713 generic.go:334] "Generic (PLEG): container finished" podID="7d3e039f-375f-411e-b265-f6188fc80d58" containerID="a314a81594457aabe7d7b756ec1dc15d8bb66ba7e671bd1329baed963bb45a34" exitCode=0 Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.286671 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-r6jzk" event={"ID":"7d3e039f-375f-411e-b265-f6188fc80d58","Type":"ContainerDied","Data":"a314a81594457aabe7d7b756ec1dc15d8bb66ba7e671bd1329baed963bb45a34"} Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.716378 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-l62vj" Mar 14 05:52:59 crc kubenswrapper[4713]: E0314 05:52:59.727816 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.757451 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-combined-ca-bundle\") pod \"0cd7eedb-d5e4-4df8-9ff6-717989483135\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.757656 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-db-sync-config-data\") pod \"0cd7eedb-d5e4-4df8-9ff6-717989483135\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.757793 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gds\" (UniqueName: \"kubernetes.io/projected/0cd7eedb-d5e4-4df8-9ff6-717989483135-kube-api-access-c6gds\") pod \"0cd7eedb-d5e4-4df8-9ff6-717989483135\" (UID: \"0cd7eedb-d5e4-4df8-9ff6-717989483135\") " Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.778741 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd7eedb-d5e4-4df8-9ff6-717989483135-kube-api-access-c6gds" (OuterVolumeSpecName: "kube-api-access-c6gds") pod "0cd7eedb-d5e4-4df8-9ff6-717989483135" (UID: "0cd7eedb-d5e4-4df8-9ff6-717989483135"). InnerVolumeSpecName "kube-api-access-c6gds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.786309 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0cd7eedb-d5e4-4df8-9ff6-717989483135" (UID: "0cd7eedb-d5e4-4df8-9ff6-717989483135"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.794574 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cd7eedb-d5e4-4df8-9ff6-717989483135" (UID: "0cd7eedb-d5e4-4df8-9ff6-717989483135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.860263 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6gds\" (UniqueName: \"kubernetes.io/projected/0cd7eedb-d5e4-4df8-9ff6-717989483135-kube-api-access-c6gds\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.860298 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:52:59 crc kubenswrapper[4713]: I0314 05:52:59.860307 4713 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cd7eedb-d5e4-4df8-9ff6-717989483135-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.299327 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1e410ef9-e81f-4b9f-b12b-46db77edeb7e","Type":"ContainerStarted","Data":"86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9"} Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.299411 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="ceilometer-notification-agent" containerID="cri-o://3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e" gracePeriod=30 Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.299489 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="sg-core" containerID="cri-o://5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39" gracePeriod=30 Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.299481 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="proxy-httpd" containerID="cri-o://86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9" gracePeriod=30 Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.299530 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.304956 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l62vj" event={"ID":"0cd7eedb-d5e4-4df8-9ff6-717989483135","Type":"ContainerDied","Data":"79c22985875d145740a12bc8e903b6edd5b0256d8895b44e3305006c8a9e26e3"} Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.305027 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79c22985875d145740a12bc8e903b6edd5b0256d8895b44e3305006c8a9e26e3" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.305143 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-l62vj" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.309981 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/2.log" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.311274 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/1.log" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.312263 4713 generic.go:334] "Generic (PLEG): container finished" podID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerID="203b7b9f2b4ab6267edb7d43d937c26933bb7ac10a4ddcd50556ce87ac84fbc4" exitCode=1 Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.312299 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-569ddb745b-tqpzz" event={"ID":"6e045597-8d40-424a-8982-8dbfb1e379e3","Type":"ContainerDied","Data":"203b7b9f2b4ab6267edb7d43d937c26933bb7ac10a4ddcd50556ce87ac84fbc4"} Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.312377 4713 scope.go:117] "RemoveContainer" containerID="026c9cd5265e0ecd6746be03e428614bf5b60f9e0c9fafda55bfb6031d507c8d" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.313269 4713 scope.go:117] "RemoveContainer" containerID="203b7b9f2b4ab6267edb7d43d937c26933bb7ac10a4ddcd50556ce87ac84fbc4" Mar 14 05:53:00 crc kubenswrapper[4713]: E0314 05:53:00.313726 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-569ddb745b-tqpzz_openstack(6e045597-8d40-424a-8982-8dbfb1e379e3)\"" pod="openstack/neutron-569ddb745b-tqpzz" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.720266 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-keystone-listener-6f98f48554-4fr2x"] Mar 14 05:53:00 crc kubenswrapper[4713]: E0314 05:53:00.721123 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd7eedb-d5e4-4df8-9ff6-717989483135" containerName="barbican-db-sync" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.721141 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd7eedb-d5e4-4df8-9ff6-717989483135" containerName="barbican-db-sync" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.721383 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd7eedb-d5e4-4df8-9ff6-717989483135" containerName="barbican-db-sync" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.722515 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.733865 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.734142 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xzbbk" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.734341 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.757652 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6c94555ff-7pxkm"] Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.764060 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.768887 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.781282 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f98f48554-4fr2x"] Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.835307 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6c94555ff-7pxkm"] Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.904273 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1f41851-9a76-4730-9535-113163dd38dc-config-data-custom\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.904337 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b63291b9-15a4-43c2-ba17-be0374c459b5-config-data-custom\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.904389 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63291b9-15a4-43c2-ba17-be0374c459b5-logs\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.904414 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f41851-9a76-4730-9535-113163dd38dc-config-data\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.904438 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtsp2\" (UniqueName: \"kubernetes.io/projected/b63291b9-15a4-43c2-ba17-be0374c459b5-kube-api-access-qtsp2\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.904456 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f41851-9a76-4730-9535-113163dd38dc-logs\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.904506 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63291b9-15a4-43c2-ba17-be0374c459b5-combined-ca-bundle\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.904541 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63291b9-15a4-43c2-ba17-be0374c459b5-config-data\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 
05:53:00.904566 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zggqn\" (UniqueName: \"kubernetes.io/projected/d1f41851-9a76-4730-9535-113163dd38dc-kube-api-access-zggqn\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.904610 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f41851-9a76-4730-9535-113163dd38dc-combined-ca-bundle\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.919751 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2vb44"] Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.922064 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:00 crc kubenswrapper[4713]: I0314 05:53:00.961684 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2vb44"] Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.014986 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63291b9-15a4-43c2-ba17-be0374c459b5-config-data\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015339 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015372 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zggqn\" (UniqueName: \"kubernetes.io/projected/d1f41851-9a76-4730-9535-113163dd38dc-kube-api-access-zggqn\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015401 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f41851-9a76-4730-9535-113163dd38dc-combined-ca-bundle\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015440 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1f41851-9a76-4730-9535-113163dd38dc-config-data-custom\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015473 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b63291b9-15a4-43c2-ba17-be0374c459b5-config-data-custom\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015516 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsgrm\" (UniqueName: \"kubernetes.io/projected/82eb7f84-26de-44cf-9a23-d7b9b7d50323-kube-api-access-nsgrm\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015538 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015560 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63291b9-15a4-43c2-ba17-be0374c459b5-logs\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015582 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f41851-9a76-4730-9535-113163dd38dc-config-data\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015603 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015625 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtsp2\" (UniqueName: \"kubernetes.io/projected/b63291b9-15a4-43c2-ba17-be0374c459b5-kube-api-access-qtsp2\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015645 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f41851-9a76-4730-9535-113163dd38dc-logs\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015662 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-config\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc 
kubenswrapper[4713]: I0314 05:53:01.015695 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.015773 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63291b9-15a4-43c2-ba17-be0374c459b5-combined-ca-bundle\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.019310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63291b9-15a4-43c2-ba17-be0374c459b5-logs\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.033066 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b63291b9-15a4-43c2-ba17-be0374c459b5-config-data-custom\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.033654 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f41851-9a76-4730-9535-113163dd38dc-logs\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 
05:53:01.034094 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f41851-9a76-4730-9535-113163dd38dc-combined-ca-bundle\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.036692 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63291b9-15a4-43c2-ba17-be0374c459b5-config-data\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.036961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63291b9-15a4-43c2-ba17-be0374c459b5-combined-ca-bundle\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.041832 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1f41851-9a76-4730-9535-113163dd38dc-config-data-custom\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.051065 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f41851-9a76-4730-9535-113163dd38dc-config-data\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: 
I0314 05:53:01.051813 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtsp2\" (UniqueName: \"kubernetes.io/projected/b63291b9-15a4-43c2-ba17-be0374c459b5-kube-api-access-qtsp2\") pod \"barbican-worker-6c94555ff-7pxkm\" (UID: \"b63291b9-15a4-43c2-ba17-be0374c459b5\") " pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.066749 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zggqn\" (UniqueName: \"kubernetes.io/projected/d1f41851-9a76-4730-9535-113163dd38dc-kube-api-access-zggqn\") pod \"barbican-keystone-listener-6f98f48554-4fr2x\" (UID: \"d1f41851-9a76-4730-9535-113163dd38dc\") " pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.090200 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.109451 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6c94555ff-7pxkm" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.120014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.120087 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-config\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.120112 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.120425 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.120577 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsgrm\" (UniqueName: \"kubernetes.io/projected/82eb7f84-26de-44cf-9a23-d7b9b7d50323-kube-api-access-nsgrm\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-2vb44"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.120609 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.121303 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-config\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.121495 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.121849 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.122178 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.122395 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.168080 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-54566f9956-h7rfh"]
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.170445 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.172137 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-r6jzk"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.173630 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.194892 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsgrm\" (UniqueName: \"kubernetes.io/projected/82eb7f84-26de-44cf-9a23-d7b9b7d50323-kube-api-access-nsgrm\") pod \"dnsmasq-dns-848cf88cfc-2vb44\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " pod="openstack/dnsmasq-dns-848cf88cfc-2vb44"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.223020 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-scripts\") pod \"7d3e039f-375f-411e-b265-f6188fc80d58\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") "
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.223109 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-combined-ca-bundle\") pod \"7d3e039f-375f-411e-b265-f6188fc80d58\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") "
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.223182 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsjz2\" (UniqueName: \"kubernetes.io/projected/7d3e039f-375f-411e-b265-f6188fc80d58-kube-api-access-dsjz2\") pod \"7d3e039f-375f-411e-b265-f6188fc80d58\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") "
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.223337 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-config-data\") pod \"7d3e039f-375f-411e-b265-f6188fc80d58\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") "
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.223649 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data-custom\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.223750 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7vqq\" (UniqueName: \"kubernetes.io/projected/80a9d55e-79c2-4a43-af43-4c9213e93501-kube-api-access-n7vqq\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.223807 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.223972 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-combined-ca-bundle\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.224193 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80a9d55e-79c2-4a43-af43-4c9213e93501-logs\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.237189 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3e039f-375f-411e-b265-f6188fc80d58-kube-api-access-dsjz2" (OuterVolumeSpecName: "kube-api-access-dsjz2") pod "7d3e039f-375f-411e-b265-f6188fc80d58" (UID: "7d3e039f-375f-411e-b265-f6188fc80d58"). InnerVolumeSpecName "kube-api-access-dsjz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.244759 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-scripts" (OuterVolumeSpecName: "scripts") pod "7d3e039f-375f-411e-b265-f6188fc80d58" (UID: "7d3e039f-375f-411e-b265-f6188fc80d58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.249718 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54566f9956-h7rfh"]
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.259682 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2vb44"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.268081 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d3e039f-375f-411e-b265-f6188fc80d58" (UID: "7d3e039f-375f-411e-b265-f6188fc80d58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.325863 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-db-sync-config-data\") pod \"7d3e039f-375f-411e-b265-f6188fc80d58\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") "
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.326314 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d3e039f-375f-411e-b265-f6188fc80d58-etc-machine-id\") pod \"7d3e039f-375f-411e-b265-f6188fc80d58\" (UID: \"7d3e039f-375f-411e-b265-f6188fc80d58\") "
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.326812 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data-custom\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.326891 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7vqq\" (UniqueName: \"kubernetes.io/projected/80a9d55e-79c2-4a43-af43-4c9213e93501-kube-api-access-n7vqq\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.326919 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.326990 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-combined-ca-bundle\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.327097 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80a9d55e-79c2-4a43-af43-4c9213e93501-logs\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.327466 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d3e039f-375f-411e-b265-f6188fc80d58-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7d3e039f-375f-411e-b265-f6188fc80d58" (UID: "7d3e039f-375f-411e-b265-f6188fc80d58"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.336327 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.338974 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.339005 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsjz2\" (UniqueName: \"kubernetes.io/projected/7d3e039f-375f-411e-b265-f6188fc80d58-kube-api-access-dsjz2\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.339276 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80a9d55e-79c2-4a43-af43-4c9213e93501-logs\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.339759 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7d3e039f-375f-411e-b265-f6188fc80d58" (UID: "7d3e039f-375f-411e-b265-f6188fc80d58"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.339840 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-config-data" (OuterVolumeSpecName: "config-data") pod "7d3e039f-375f-411e-b265-f6188fc80d58" (UID: "7d3e039f-375f-411e-b265-f6188fc80d58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.351713 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data-custom\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.354364 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-combined-ca-bundle\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.356485 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/2.log"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.356537 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7vqq\" (UniqueName: \"kubernetes.io/projected/80a9d55e-79c2-4a43-af43-4c9213e93501-kube-api-access-n7vqq\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.363614 4713 generic.go:334] "Generic (PLEG): container finished" podID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerID="86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9" exitCode=0
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.363656 4713 generic.go:334] "Generic (PLEG): container finished" podID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerID="5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39" exitCode=2
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.363710 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e410ef9-e81f-4b9f-b12b-46db77edeb7e","Type":"ContainerDied","Data":"86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9"}
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.363744 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e410ef9-e81f-4b9f-b12b-46db77edeb7e","Type":"ContainerDied","Data":"5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39"}
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.365267 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data\") pod \"barbican-api-54566f9956-h7rfh\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.367182 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-r6jzk" event={"ID":"7d3e039f-375f-411e-b265-f6188fc80d58","Type":"ContainerDied","Data":"c06b75d18494115cc4eddd4c97d98336c64a2a616abecae7e291c4ed369cba31"}
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.367203 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c06b75d18494115cc4eddd4c97d98336c64a2a616abecae7e291c4ed369cba31"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.367271 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-r6jzk"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.441565 4713 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d3e039f-375f-411e-b265-f6188fc80d58-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.441772 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.441786 4713 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d3e039f-375f-411e-b265-f6188fc80d58-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.514808 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54566f9956-h7rfh"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.554178 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 14 05:53:01 crc kubenswrapper[4713]: E0314 05:53:01.573922 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3e039f-375f-411e-b265-f6188fc80d58" containerName="cinder-db-sync"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.573964 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3e039f-375f-411e-b265-f6188fc80d58" containerName="cinder-db-sync"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.582326 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3e039f-375f-411e-b265-f6188fc80d58" containerName="cinder-db-sync"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.610890 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.623939 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-28qsr"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.624284 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.624516 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.624855 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.784347 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qp8\" (UniqueName: \"kubernetes.io/projected/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-kube-api-access-46qp8\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.784396 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.784461 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.784556 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-scripts\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.784583 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.784666 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.818750 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.819082 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2vb44"]
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.822033 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fj45w"]
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.824412 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.886882 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-svc\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887000 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887063 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qp8\" (UniqueName: \"kubernetes.io/projected/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-kube-api-access-46qp8\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887080 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887119 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887146 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpx8d\" (UniqueName: \"kubernetes.io/projected/02ae920c-9439-4c60-904b-bea08ca59dac-kube-api-access-zpx8d\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887177 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887234 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887264 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-config\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887294 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887341 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-scripts\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887351 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.887366 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: E0314 05:53:01.897851 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d3e039f_375f_411e_b265_f6188fc80d58.slice\": RecentStats: unable to find data in memory cache]"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.898528 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fj45w"]
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.907113 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.907819 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.907856 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qp8\" (UniqueName: \"kubernetes.io/projected/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-kube-api-access-46qp8\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.909794 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.912952 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-scripts\") pod \"cinder-scheduler-0\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.946177 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.949188 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.955955 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.962928 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.988683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-svc\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.988728 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knd82\" (UniqueName: \"kubernetes.io/projected/64bc4849-85d3-4043-bfae-18176e47b753-kube-api-access-knd82\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.988811 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data-custom\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.988873 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64bc4849-85d3-4043-bfae-18176e47b753-logs\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.988922 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpx8d\" (UniqueName: \"kubernetes.io/projected/02ae920c-9439-4c60-904b-bea08ca59dac-kube-api-access-zpx8d\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.988946 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.988971 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.988999 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-config\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.989014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.989046 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.989076 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64bc4849-85d3-4043-bfae-18176e47b753-etc-machine-id\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.989103 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-scripts\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.989119 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.989734 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-svc\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.990101 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.990881 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-config\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.991094 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:01 crc kubenswrapper[4713]: I0314 05:53:01.992944 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.013521 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpx8d\" (UniqueName: \"kubernetes.io/projected/02ae920c-9439-4c60-904b-bea08ca59dac-kube-api-access-zpx8d\") pod \"dnsmasq-dns-6578955fd5-fj45w\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " pod="openstack/dnsmasq-dns-6578955fd5-fj45w"
Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.090310 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.090365 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64bc4849-85d3-4043-bfae-18176e47b753-etc-machine-id\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.090395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-scripts\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.090414 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.090457 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knd82\" (UniqueName: \"kubernetes.io/projected/64bc4849-85d3-4043-bfae-18176e47b753-kube-api-access-knd82\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.090515 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data-custom\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0"
Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.090520 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName:
\"kubernetes.io/host-path/64bc4849-85d3-4043-bfae-18176e47b753-etc-machine-id\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.091145 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64bc4849-85d3-4043-bfae-18176e47b753-logs\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.090538 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64bc4849-85d3-4043-bfae-18176e47b753-logs\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.094633 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data-custom\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.095816 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-scripts\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.096131 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.098031 4713 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.110347 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knd82\" (UniqueName: \"kubernetes.io/projected/64bc4849-85d3-4043-bfae-18176e47b753-kube-api-access-knd82\") pod \"cinder-api-0\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") " pod="openstack/cinder-api-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.123289 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.165770 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.221979 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6c94555ff-7pxkm"] Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.281135 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.284149 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f98f48554-4fr2x"] Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.339603 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2vb44"] Mar 14 05:53:02 crc kubenswrapper[4713]: W0314 05:53:02.386630 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82eb7f84_26de_44cf_9a23_d7b9b7d50323.slice/crio-4d905f1c056f01fd8be6586b372c0beb45ee29f413459d94b2559dd5d743f6b4 WatchSource:0}: Error finding container 4d905f1c056f01fd8be6586b372c0beb45ee29f413459d94b2559dd5d743f6b4: Status 404 returned error can't find the container with id 4d905f1c056f01fd8be6586b372c0beb45ee29f413459d94b2559dd5d743f6b4 Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.408305 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6c94555ff-7pxkm" event={"ID":"b63291b9-15a4-43c2-ba17-be0374c459b5","Type":"ContainerStarted","Data":"6dcfb91e35fe8a37b4ede89496a48ba1b9f9ccda2ce4300f3c7ddf2cac4752c5"} Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.419910 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" event={"ID":"d1f41851-9a76-4730-9535-113163dd38dc","Type":"ContainerStarted","Data":"032bbda546cce218de2027123a8ad993dbf13dc4dd10ee938737baae8f3ce34d"} Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.529874 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54566f9956-h7rfh"] Mar 14 05:53:02 crc kubenswrapper[4713]: W0314 05:53:02.562365 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80a9d55e_79c2_4a43_af43_4c9213e93501.slice/crio-2ebf301543321a9b6245b87a71b9695f25587242c9534fa50385d5a696c72972 WatchSource:0}: Error finding container 2ebf301543321a9b6245b87a71b9695f25587242c9534fa50385d5a696c72972: Status 404 returned error can't find the container with id 2ebf301543321a9b6245b87a71b9695f25587242c9534fa50385d5a696c72972 Mar 14 05:53:02 crc kubenswrapper[4713]: I0314 05:53:02.811634 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.185990 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.266012 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fj45w"] Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.547955 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54566f9956-h7rfh" event={"ID":"80a9d55e-79c2-4a43-af43-4c9213e93501","Type":"ContainerStarted","Data":"41b1db5ad83e7c6047fa94535ac12eb197fad8feae25f04d65b343f000c6a301"} Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.547991 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54566f9956-h7rfh" event={"ID":"80a9d55e-79c2-4a43-af43-4c9213e93501","Type":"ContainerStarted","Data":"2ebf301543321a9b6245b87a71b9695f25587242c9534fa50385d5a696c72972"} Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.553375 4713 generic.go:334] "Generic (PLEG): container finished" podID="82eb7f84-26de-44cf-9a23-d7b9b7d50323" containerID="7a13ed09100b11063be6372e7606dd9a7a437c1b496781e71c8642e289bc5cc6" exitCode=0 Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.553482 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" 
event={"ID":"82eb7f84-26de-44cf-9a23-d7b9b7d50323","Type":"ContainerDied","Data":"7a13ed09100b11063be6372e7606dd9a7a437c1b496781e71c8642e289bc5cc6"} Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.553514 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" event={"ID":"82eb7f84-26de-44cf-9a23-d7b9b7d50323","Type":"ContainerStarted","Data":"4d905f1c056f01fd8be6586b372c0beb45ee29f413459d94b2559dd5d743f6b4"} Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.560417 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" event={"ID":"02ae920c-9439-4c60-904b-bea08ca59dac","Type":"ContainerStarted","Data":"4ad836fb6b9e13e67c2e2ba7ea17e7aea7020bbe756fa0150f8ca509535b927b"} Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.609687 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23c4d5b3-f9fc-4cba-9a5a-133348482f3d","Type":"ContainerStarted","Data":"f6bf420b6f12e65cde96ec9a5604b21f9443890c28a8e3addcd0f472e8a02e07"} Mar 14 05:53:03 crc kubenswrapper[4713]: I0314 05:53:03.610030 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64bc4849-85d3-4043-bfae-18176e47b753","Type":"ContainerStarted","Data":"91aa6d2909fec99c84c4297c6c8fa7e572ae38bb71b0bb7d6e0ee98683c6de63"} Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.165439 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.294425 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-svc\") pod \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.294716 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-sb\") pod \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.294843 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-config\") pod \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.294996 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-nb\") pod \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.295092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-swift-storage-0\") pod \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.295158 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsgrm\" 
(UniqueName: \"kubernetes.io/projected/82eb7f84-26de-44cf-9a23-d7b9b7d50323-kube-api-access-nsgrm\") pod \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\" (UID: \"82eb7f84-26de-44cf-9a23-d7b9b7d50323\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.311964 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82eb7f84-26de-44cf-9a23-d7b9b7d50323-kube-api-access-nsgrm" (OuterVolumeSpecName: "kube-api-access-nsgrm") pod "82eb7f84-26de-44cf-9a23-d7b9b7d50323" (UID: "82eb7f84-26de-44cf-9a23-d7b9b7d50323"). InnerVolumeSpecName "kube-api-access-nsgrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.355760 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82eb7f84-26de-44cf-9a23-d7b9b7d50323" (UID: "82eb7f84-26de-44cf-9a23-d7b9b7d50323"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.356586 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "82eb7f84-26de-44cf-9a23-d7b9b7d50323" (UID: "82eb7f84-26de-44cf-9a23-d7b9b7d50323"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.381077 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82eb7f84-26de-44cf-9a23-d7b9b7d50323" (UID: "82eb7f84-26de-44cf-9a23-d7b9b7d50323"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.399658 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.400146 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsgrm\" (UniqueName: \"kubernetes.io/projected/82eb7f84-26de-44cf-9a23-d7b9b7d50323-kube-api-access-nsgrm\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.400526 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.400647 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.405555 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82eb7f84-26de-44cf-9a23-d7b9b7d50323" (UID: "82eb7f84-26de-44cf-9a23-d7b9b7d50323"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.443637 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-config" (OuterVolumeSpecName: "config") pod "82eb7f84-26de-44cf-9a23-d7b9b7d50323" (UID: "82eb7f84-26de-44cf-9a23-d7b9b7d50323"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.503154 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.503185 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82eb7f84-26de-44cf-9a23-d7b9b7d50323-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.628508 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54566f9956-h7rfh" event={"ID":"80a9d55e-79c2-4a43-af43-4c9213e93501","Type":"ContainerStarted","Data":"9c2e9acef78bdf0d82a2dca07b76f499e6bcb16689c3674ce8a2cfa79e361adc"} Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.629357 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54566f9956-h7rfh" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.629402 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54566f9956-h7rfh" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.639139 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.639766 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-2vb44" event={"ID":"82eb7f84-26de-44cf-9a23-d7b9b7d50323","Type":"ContainerDied","Data":"4d905f1c056f01fd8be6586b372c0beb45ee29f413459d94b2559dd5d743f6b4"} Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.639914 4713 scope.go:117] "RemoveContainer" containerID="7a13ed09100b11063be6372e7606dd9a7a437c1b496781e71c8642e289bc5cc6" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.661084 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" event={"ID":"02ae920c-9439-4c60-904b-bea08ca59dac","Type":"ContainerStarted","Data":"7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b"} Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.663083 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.672590 4713 generic.go:334] "Generic (PLEG): container finished" podID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerID="3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e" exitCode=0 Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.672643 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e410ef9-e81f-4b9f-b12b-46db77edeb7e","Type":"ContainerDied","Data":"3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e"} Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.672673 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e410ef9-e81f-4b9f-b12b-46db77edeb7e","Type":"ContainerDied","Data":"0f3e7130024aeb6928e1559de180882837b1aa9a5a6a4f703e4111f41d02c48d"} Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.706092 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-config-data\") pod \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.706398 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-scripts\") pod \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.706494 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-run-httpd\") pod \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.706532 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-sg-core-conf-yaml\") pod \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.706583 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-combined-ca-bundle\") pod \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.706629 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9gsb\" (UniqueName: \"kubernetes.io/projected/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-kube-api-access-t9gsb\") pod \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\" (UID: 
\"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.706657 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-log-httpd\") pod \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\" (UID: \"1e410ef9-e81f-4b9f-b12b-46db77edeb7e\") " Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.718978 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1e410ef9-e81f-4b9f-b12b-46db77edeb7e" (UID: "1e410ef9-e81f-4b9f-b12b-46db77edeb7e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.720940 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1e410ef9-e81f-4b9f-b12b-46db77edeb7e" (UID: "1e410ef9-e81f-4b9f-b12b-46db77edeb7e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.728898 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-scripts" (OuterVolumeSpecName: "scripts") pod "1e410ef9-e81f-4b9f-b12b-46db77edeb7e" (UID: "1e410ef9-e81f-4b9f-b12b-46db77edeb7e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.732151 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-kube-api-access-t9gsb" (OuterVolumeSpecName: "kube-api-access-t9gsb") pod "1e410ef9-e81f-4b9f-b12b-46db77edeb7e" (UID: "1e410ef9-e81f-4b9f-b12b-46db77edeb7e"). InnerVolumeSpecName "kube-api-access-t9gsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.789842 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1e410ef9-e81f-4b9f-b12b-46db77edeb7e" (UID: "1e410ef9-e81f-4b9f-b12b-46db77edeb7e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.808352 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-54566f9956-h7rfh" podStartSLOduration=3.808335103 podStartE2EDuration="3.808335103s" podCreationTimestamp="2026-03-14 05:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:04.657770778 +0000 UTC m=+1567.745680098" watchObservedRunningTime="2026-03-14 05:53:04.808335103 +0000 UTC m=+1567.896244403" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.809957 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.809992 4713 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.810007 4713 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.810020 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9gsb\" (UniqueName: \"kubernetes.io/projected/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-kube-api-access-t9gsb\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.810031 4713 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.821534 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2vb44"] Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.836865 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2vb44"] Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.867399 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e410ef9-e81f-4b9f-b12b-46db77edeb7e" (UID: "1e410ef9-e81f-4b9f-b12b-46db77edeb7e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.887745 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-config-data" (OuterVolumeSpecName: "config-data") pod "1e410ef9-e81f-4b9f-b12b-46db77edeb7e" (UID: "1e410ef9-e81f-4b9f-b12b-46db77edeb7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.919776 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:04 crc kubenswrapper[4713]: I0314 05:53:04.919814 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e410ef9-e81f-4b9f-b12b-46db77edeb7e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.189532 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.392289 4713 scope.go:117] "RemoveContainer" containerID="86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.476750 4713 scope.go:117] "RemoveContainer" containerID="5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.509422 4713 scope.go:117] "RemoveContainer" containerID="3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.540447 4713 scope.go:117] "RemoveContainer" containerID="86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9" Mar 14 05:53:05 crc kubenswrapper[4713]: E0314 05:53:05.540935 4713 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9\": container with ID starting with 86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9 not found: ID does not exist" containerID="86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.540975 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9"} err="failed to get container status \"86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9\": rpc error: code = NotFound desc = could not find container \"86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9\": container with ID starting with 86447df88be852a4c19025bfeee9015be11f31d4a960aced0534efffb6b1fef9 not found: ID does not exist" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.541005 4713 scope.go:117] "RemoveContainer" containerID="5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39" Mar 14 05:53:05 crc kubenswrapper[4713]: E0314 05:53:05.541473 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39\": container with ID starting with 5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39 not found: ID does not exist" containerID="5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.541504 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39"} err="failed to get container status \"5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39\": rpc error: code = NotFound desc = could not find container 
\"5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39\": container with ID starting with 5e80ee46c8450575d1ccd02c94fc70de30cadf0e758269e088e7d6b97f5fdc39 not found: ID does not exist" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.541521 4713 scope.go:117] "RemoveContainer" containerID="3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e" Mar 14 05:53:05 crc kubenswrapper[4713]: E0314 05:53:05.542677 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e\": container with ID starting with 3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e not found: ID does not exist" containerID="3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.542742 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e"} err="failed to get container status \"3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e\": rpc error: code = NotFound desc = could not find container \"3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e\": container with ID starting with 3fe6738ccd6307eaeb55b5699e36e6c4d82123d6674c6ad5046fac00f71f3a5e not found: ID does not exist" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.587161 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82eb7f84-26de-44cf-9a23-d7b9b7d50323" path="/var/lib/kubelet/pods/82eb7f84-26de-44cf-9a23-d7b9b7d50323/volumes" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.683360 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.686633 4713 generic.go:334] "Generic (PLEG): container finished" podID="02ae920c-9439-4c60-904b-bea08ca59dac" containerID="7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b" exitCode=0 Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.686712 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" event={"ID":"02ae920c-9439-4c60-904b-bea08ca59dac","Type":"ContainerDied","Data":"7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b"} Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.931403 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.979286 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.996380 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:53:05 crc kubenswrapper[4713]: E0314 05:53:05.996976 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="ceilometer-notification-agent" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.996996 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="ceilometer-notification-agent" Mar 14 05:53:05 crc kubenswrapper[4713]: E0314 05:53:05.997012 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82eb7f84-26de-44cf-9a23-d7b9b7d50323" containerName="init" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.997018 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="82eb7f84-26de-44cf-9a23-d7b9b7d50323" containerName="init" Mar 14 05:53:05 crc kubenswrapper[4713]: E0314 05:53:05.997058 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="proxy-httpd" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.997066 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="proxy-httpd" Mar 14 05:53:05 crc kubenswrapper[4713]: E0314 05:53:05.997082 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="sg-core" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.997088 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="sg-core" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.997306 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="ceilometer-notification-agent" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.997332 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="proxy-httpd" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.997341 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="82eb7f84-26de-44cf-9a23-d7b9b7d50323" containerName="init" Mar 14 05:53:05 crc kubenswrapper[4713]: I0314 05:53:05.997354 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" containerName="sg-core" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.006890 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.016831 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.021780 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.023856 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.054861 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx78h\" (UniqueName: \"kubernetes.io/projected/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-kube-api-access-rx78h\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.054952 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.054997 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-log-httpd\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.055065 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-config-data\") pod \"ceilometer-0\" (UID: 
\"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.055245 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-scripts\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.055333 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.055364 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-run-httpd\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.157519 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx78h\" (UniqueName: \"kubernetes.io/projected/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-kube-api-access-rx78h\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.157586 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.157618 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-log-httpd\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.157663 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-config-data\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.157735 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-scripts\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.157782 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.157799 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-run-httpd\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.158651 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-run-httpd\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " 
pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.158763 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-log-httpd\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.161927 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.162578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-scripts\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.172118 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.172249 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-config-data\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.186942 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx78h\" (UniqueName: 
\"kubernetes.io/projected/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-kube-api-access-rx78h\") pod \"ceilometer-0\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.283350 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.349302 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.350680 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.531172 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fps7b"] Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.712174 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23c4d5b3-f9fc-4cba-9a5a-133348482f3d","Type":"ContainerStarted","Data":"906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b"} Mar 14 05:53:06 crc kubenswrapper[4713]: I0314 05:53:06.723543 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64bc4849-85d3-4043-bfae-18176e47b753","Type":"ContainerStarted","Data":"b5f8077200584a3eab19484274276e4e9756923899697e691e5672f787622409"} Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.587506 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e410ef9-e81f-4b9f-b12b-46db77edeb7e" path="/var/lib/kubelet/pods/1e410ef9-e81f-4b9f-b12b-46db77edeb7e/volumes" Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.627347 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:53:07 crc kubenswrapper[4713]: W0314 05:53:07.630072 4713 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d19e2d1_7c0e_4199_ac1d_c1cb9fc56d3b.slice/crio-66a805e04b9b24f6b6f7905ca2ea6d10bcdf340b1090eed2e0c5e0592b998009 WatchSource:0}: Error finding container 66a805e04b9b24f6b6f7905ca2ea6d10bcdf340b1090eed2e0c5e0592b998009: Status 404 returned error can't find the container with id 66a805e04b9b24f6b6f7905ca2ea6d10bcdf340b1090eed2e0c5e0592b998009 Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.774959 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6c94555ff-7pxkm" event={"ID":"b63291b9-15a4-43c2-ba17-be0374c459b5","Type":"ContainerStarted","Data":"8717e2bdcb7b3dcaa76d39bb6975895560ecc98bb28724804372ba616a37c3a0"} Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.795854 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" event={"ID":"d1f41851-9a76-4730-9535-113163dd38dc","Type":"ContainerStarted","Data":"d77c112f19b0e5dd739ac3f97f42aeea5b895091274dcbf3f02cee4305a51f32"} Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.799359 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" event={"ID":"02ae920c-9439-4c60-904b-bea08ca59dac","Type":"ContainerStarted","Data":"756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78"} Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.800499 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.805829 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fps7b" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" containerID="cri-o://8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a" gracePeriod=2 Mar 14 05:53:07 crc kubenswrapper[4713]: 
I0314 05:53:07.805953 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerStarted","Data":"66a805e04b9b24f6b6f7905ca2ea6d10bcdf340b1090eed2e0c5e0592b998009"} Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.846082 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" podStartSLOduration=6.846056343 podStartE2EDuration="6.846056343s" podCreationTimestamp="2026-03-14 05:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:07.825939972 +0000 UTC m=+1570.913849272" watchObservedRunningTime="2026-03-14 05:53:07.846056343 +0000 UTC m=+1570.933965643" Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.851713 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.852285 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.852360 4713 scope.go:117] "RemoveContainer" containerID="203b7b9f2b4ab6267edb7d43d937c26933bb7ac10a4ddcd50556ce87ac84fbc4" Mar 14 05:53:07 crc kubenswrapper[4713]: E0314 05:53:07.852633 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-569ddb745b-tqpzz_openstack(6e045597-8d40-424a-8982-8dbfb1e379e3)\"" pod="openstack/neutron-569ddb745b-tqpzz" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" Mar 14 05:53:07 crc kubenswrapper[4713]: I0314 05:53:07.858865 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-569ddb745b-tqpzz" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" 
containerName="neutron-api" probeResult="failure" output="Get \"http://10.217.0.200:9696/\": dial tcp 10.217.0.200:9696: connect: connection refused" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.036368 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-66dc7cf97d-ll6gt"] Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.038708 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.046611 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.046800 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.054892 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66dc7cf97d-ll6gt"] Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.109883 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-internal-tls-certs\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.110317 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj6dw\" (UniqueName: \"kubernetes.io/projected/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-kube-api-access-fj6dw\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.110358 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-config-data-custom\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.110389 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-public-tls-certs\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.110452 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-combined-ca-bundle\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.110474 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-config-data\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.110499 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-logs\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.213502 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fj6dw\" (UniqueName: \"kubernetes.io/projected/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-kube-api-access-fj6dw\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.213583 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-config-data-custom\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.213617 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-public-tls-certs\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.213692 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-combined-ca-bundle\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.213724 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-config-data\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.213752 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-logs\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.213794 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-internal-tls-certs\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.218694 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-public-tls-certs\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.219719 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-logs\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.223902 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-combined-ca-bundle\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.226293 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-config-data\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.239399 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-internal-tls-certs\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.240164 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-config-data-custom\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.242990 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj6dw\" (UniqueName: \"kubernetes.io/projected/c0226c41-0d23-4ea8-b8ff-0f1b20a04f68-kube-api-access-fj6dw\") pod \"barbican-api-66dc7cf97d-ll6gt\" (UID: \"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68\") " pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.440886 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.586745 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.641487 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-catalog-content\") pod \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.641568 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnbmp\" (UniqueName: \"kubernetes.io/projected/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-kube-api-access-xnbmp\") pod \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.641984 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-utilities\") pod \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\" (UID: \"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6\") " Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.652824 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-utilities" (OuterVolumeSpecName: "utilities") pod "8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" (UID: "8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.766181 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-kube-api-access-xnbmp" (OuterVolumeSpecName: "kube-api-access-xnbmp") pod "8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" (UID: "8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6"). InnerVolumeSpecName "kube-api-access-xnbmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.791946 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.791991 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnbmp\" (UniqueName: \"kubernetes.io/projected/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-kube-api-access-xnbmp\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.847766 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64bc4849-85d3-4043-bfae-18176e47b753","Type":"ContainerStarted","Data":"994ee36d06ca632ff30863cbd9a2aa7b624d5f4c65a4267928cdab75b1474f56"} Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.848359 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.848269 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="64bc4849-85d3-4043-bfae-18176e47b753" containerName="cinder-api" containerID="cri-o://994ee36d06ca632ff30863cbd9a2aa7b624d5f4c65a4267928cdab75b1474f56" gracePeriod=30 Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.847950 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="64bc4849-85d3-4043-bfae-18176e47b753" containerName="cinder-api-log" containerID="cri-o://b5f8077200584a3eab19484274276e4e9756923899697e691e5672f787622409" gracePeriod=30 Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.862415 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" 
event={"ID":"d1f41851-9a76-4730-9535-113163dd38dc","Type":"ContainerStarted","Data":"e46b315169d65855c53d0b0f6735ea76af2fcc7f3cb1b1598d1b8ce346ff3437"} Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.876128 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6c94555ff-7pxkm" event={"ID":"b63291b9-15a4-43c2-ba17-be0374c459b5","Type":"ContainerStarted","Data":"3868c8575f5179474557f90f304ef8c1c8dcf6f7e4fdbc24ec84d4bbfb954d88"} Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.904135 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.9041195250000005 podStartE2EDuration="7.904119525s" podCreationTimestamp="2026-03-14 05:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:08.896378079 +0000 UTC m=+1571.984287389" watchObservedRunningTime="2026-03-14 05:53:08.904119525 +0000 UTC m=+1571.992028825" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.916870 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23c4d5b3-f9fc-4cba-9a5a-133348482f3d","Type":"ContainerStarted","Data":"f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec"} Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.936546 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" (UID: "8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.943529 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6f98f48554-4fr2x" podStartSLOduration=4.155091265 podStartE2EDuration="8.943499719s" podCreationTimestamp="2026-03-14 05:53:00 +0000 UTC" firstStartedPulling="2026-03-14 05:53:02.35570143 +0000 UTC m=+1565.443610740" lastFinishedPulling="2026-03-14 05:53:07.144109854 +0000 UTC m=+1570.232019194" observedRunningTime="2026-03-14 05:53:08.922322535 +0000 UTC m=+1572.010231835" watchObservedRunningTime="2026-03-14 05:53:08.943499719 +0000 UTC m=+1572.031409029" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.963596 4713 generic.go:334] "Generic (PLEG): container finished" podID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerID="8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a" exitCode=0 Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.964082 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6c94555ff-7pxkm" podStartSLOduration=4.117148738 podStartE2EDuration="8.964063615s" podCreationTimestamp="2026-03-14 05:53:00 +0000 UTC" firstStartedPulling="2026-03-14 05:53:02.273617146 +0000 UTC m=+1565.361526446" lastFinishedPulling="2026-03-14 05:53:07.120532033 +0000 UTC m=+1570.208441323" observedRunningTime="2026-03-14 05:53:08.961901905 +0000 UTC m=+1572.049811205" watchObservedRunningTime="2026-03-14 05:53:08.964063615 +0000 UTC m=+1572.051972915" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.964156 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fps7b" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.964317 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fps7b" event={"ID":"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6","Type":"ContainerDied","Data":"8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a"} Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.964417 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fps7b" event={"ID":"8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6","Type":"ContainerDied","Data":"c2f23e3e289e71c05dee5dc0ff42cf32ee2482deeae970fc19f45cf4086e4ba4"} Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.964499 4713 scope.go:117] "RemoveContainer" containerID="8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a" Mar 14 05:53:08 crc kubenswrapper[4713]: I0314 05:53:08.966002 4713 scope.go:117] "RemoveContainer" containerID="203b7b9f2b4ab6267edb7d43d937c26933bb7ac10a4ddcd50556ce87ac84fbc4" Mar 14 05:53:08 crc kubenswrapper[4713]: E0314 05:53:08.966195 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-569ddb745b-tqpzz_openstack(6e045597-8d40-424a-8982-8dbfb1e379e3)\"" pod="openstack/neutron-569ddb745b-tqpzz" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.001512 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.045444 4713 scope.go:117] "RemoveContainer" containerID="f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 
05:53:09.094921 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.865653947 podStartE2EDuration="8.094901592s" podCreationTimestamp="2026-03-14 05:53:01 +0000 UTC" firstStartedPulling="2026-03-14 05:53:02.840310357 +0000 UTC m=+1565.928219657" lastFinishedPulling="2026-03-14 05:53:04.069558002 +0000 UTC m=+1567.157467302" observedRunningTime="2026-03-14 05:53:08.98871273 +0000 UTC m=+1572.076622030" watchObservedRunningTime="2026-03-14 05:53:09.094901592 +0000 UTC m=+1572.182810892" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.178461 4713 scope.go:117] "RemoveContainer" containerID="7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.192377 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fps7b"] Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.260525 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fps7b"] Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.266503 4713 scope.go:117] "RemoveContainer" containerID="8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a" Mar 14 05:53:09 crc kubenswrapper[4713]: E0314 05:53:09.268364 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a\": container with ID starting with 8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a not found: ID does not exist" containerID="8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.268466 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a"} err="failed to get container status 
\"8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a\": rpc error: code = NotFound desc = could not find container \"8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a\": container with ID starting with 8565f7d287208db2fadde82c487bd579bc790bd147af779c93ff468df3cf0d0a not found: ID does not exist" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.268556 4713 scope.go:117] "RemoveContainer" containerID="f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17" Mar 14 05:53:09 crc kubenswrapper[4713]: E0314 05:53:09.277508 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17\": container with ID starting with f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17 not found: ID does not exist" containerID="f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.277568 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17"} err="failed to get container status \"f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17\": rpc error: code = NotFound desc = could not find container \"f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17\": container with ID starting with f016182314ead96ae511848176b862c98ebbda89e2ec500b15b5b97f523b3a17 not found: ID does not exist" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.277603 4713 scope.go:117] "RemoveContainer" containerID="7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce" Mar 14 05:53:09 crc kubenswrapper[4713]: E0314 05:53:09.278691 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce\": container with ID starting with 7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce not found: ID does not exist" containerID="7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.278725 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce"} err="failed to get container status \"7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce\": rpc error: code = NotFound desc = could not find container \"7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce\": container with ID starting with 7f85dc8b5219927e55fe90694eaf67017a57e2b01efac76017ebb13d367076ce not found: ID does not exist" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.502137 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66dc7cf97d-ll6gt"] Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.616778 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" path="/var/lib/kubelet/pods/8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6/volumes" Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.978439 4713 generic.go:334] "Generic (PLEG): container finished" podID="64bc4849-85d3-4043-bfae-18176e47b753" containerID="b5f8077200584a3eab19484274276e4e9756923899697e691e5672f787622409" exitCode=143 Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.978498 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64bc4849-85d3-4043-bfae-18176e47b753","Type":"ContainerDied","Data":"b5f8077200584a3eab19484274276e4e9756923899697e691e5672f787622409"} Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.981427 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-66dc7cf97d-ll6gt" event={"ID":"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68","Type":"ContainerStarted","Data":"76b83d37dc537b4184092dd74d504de64b80b40caec8f16ea2e65f9877eedece"} Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.981480 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66dc7cf97d-ll6gt" event={"ID":"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68","Type":"ContainerStarted","Data":"d8ee2aac5f26a609586ff1eb274b8947151386c2622b2a7d07d43bbd4403a976"} Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.986931 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerStarted","Data":"3e575b54cb1b5df96bdf0ab7ee4be403c8cb34f91c750616e5a05e623cb6f72f"} Mar 14 05:53:09 crc kubenswrapper[4713]: I0314 05:53:09.986983 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerStarted","Data":"c75a8381a926238c7b1e25e6c5959462a6e3ad9dc577286dd72727e414dae81d"} Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.565305 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.760715 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-569ddb745b-tqpzz"] Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.761253 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-569ddb745b-tqpzz" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-api" containerID="cri-o://a748e2389e0bf3125b924d80cd2d7eba269ec4cff23f1d8a7715009735a60964" gracePeriod=30 Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.920280 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6854d4949-xljzd"] Mar 14 05:53:10 crc 
kubenswrapper[4713]: E0314 05:53:10.920848 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="extract-content" Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.920866 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="extract-content" Mar 14 05:53:10 crc kubenswrapper[4713]: E0314 05:53:10.920893 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="extract-utilities" Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.920899 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="extract-utilities" Mar 14 05:53:10 crc kubenswrapper[4713]: E0314 05:53:10.920928 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.920934 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.921174 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c20b670-fc3b-4cb5-b5eb-dd15e5bb9bf6" containerName="registry-server" Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.922432 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:10 crc kubenswrapper[4713]: I0314 05:53:10.929961 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6854d4949-xljzd"] Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.009541 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66dc7cf97d-ll6gt" event={"ID":"c0226c41-0d23-4ea8-b8ff-0f1b20a04f68","Type":"ContainerStarted","Data":"115d283fc08944a555a7613cbac55e2331b7e69e2c10a680dab28b8b4dc66977"} Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.010814 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.010842 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66dc7cf97d-ll6gt" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.079619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-config\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.079673 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-public-tls-certs\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.079951 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-internal-tls-certs\") pod \"neutron-6854d4949-xljzd\" (UID: 
\"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.080123 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-ovndb-tls-certs\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.080278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-httpd-config\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.080320 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-combined-ca-bundle\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.080383 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd5cm\" (UniqueName: \"kubernetes.io/projected/2d93f914-fdbb-4acc-83f8-30effe510c7e-kube-api-access-bd5cm\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.182007 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-internal-tls-certs\") pod \"neutron-6854d4949-xljzd\" (UID: 
\"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.182106 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-ovndb-tls-certs\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.182155 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-httpd-config\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.182176 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-combined-ca-bundle\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.182216 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd5cm\" (UniqueName: \"kubernetes.io/projected/2d93f914-fdbb-4acc-83f8-30effe510c7e-kube-api-access-bd5cm\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.182289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-config\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc 
kubenswrapper[4713]: I0314 05:53:11.182307 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-public-tls-certs\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.189642 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-internal-tls-certs\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.190003 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-httpd-config\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.191178 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-config\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.193024 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-public-tls-certs\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.200152 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-ovndb-tls-certs\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.207001 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d93f914-fdbb-4acc-83f8-30effe510c7e-combined-ca-bundle\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.214598 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd5cm\" (UniqueName: \"kubernetes.io/projected/2d93f914-fdbb-4acc-83f8-30effe510c7e-kube-api-access-bd5cm\") pod \"neutron-6854d4949-xljzd\" (UID: \"2d93f914-fdbb-4acc-83f8-30effe510c7e\") " pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:11 crc kubenswrapper[4713]: I0314 05:53:11.308935 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:12 crc kubenswrapper[4713]: I0314 05:53:12.028639 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerStarted","Data":"44da8a65554c739ca9899effe32764b7ae4be9a685379423a55bb6937395ec4b"} Mar 14 05:53:12 crc kubenswrapper[4713]: I0314 05:53:12.120595 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-66dc7cf97d-ll6gt" podStartSLOduration=5.120565267 podStartE2EDuration="5.120565267s" podCreationTimestamp="2026-03-14 05:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:11.040554996 +0000 UTC m=+1574.128464306" watchObservedRunningTime="2026-03-14 05:53:12.120565267 +0000 UTC m=+1575.208474577" Mar 14 05:53:12 crc kubenswrapper[4713]: I0314 05:53:12.123662 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 14 05:53:12 crc kubenswrapper[4713]: I0314 05:53:12.127709 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6854d4949-xljzd"] Mar 14 05:53:12 crc kubenswrapper[4713]: I0314 05:53:12.168004 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" Mar 14 05:53:12 crc kubenswrapper[4713]: I0314 05:53:12.286624 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2b9cn"] Mar 14 05:53:12 crc kubenswrapper[4713]: I0314 05:53:12.287280 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" podUID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" containerName="dnsmasq-dns" containerID="cri-o://e8df5c89ced32a4a9160e058153b13e74607fbd75974957690a8713fc8f32f04" gracePeriod=10 Mar 14 05:53:12 crc 
kubenswrapper[4713]: I0314 05:53:12.586825 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 14 05:53:12 crc kubenswrapper[4713]: I0314 05:53:12.707013 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" podUID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.199:5353: connect: connection refused" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.060910 4713 generic.go:334] "Generic (PLEG): container finished" podID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" containerID="e8df5c89ced32a4a9160e058153b13e74607fbd75974957690a8713fc8f32f04" exitCode=0 Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.061296 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" event={"ID":"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a","Type":"ContainerDied","Data":"e8df5c89ced32a4a9160e058153b13e74607fbd75974957690a8713fc8f32f04"} Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.086768 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6854d4949-xljzd" event={"ID":"2d93f914-fdbb-4acc-83f8-30effe510c7e","Type":"ContainerStarted","Data":"49f18b2b4e27fe978041e5d1e44b125c1fae5188a0cc9bafc79db042a96c2538"} Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.086830 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6854d4949-xljzd" event={"ID":"2d93f914-fdbb-4acc-83f8-30effe510c7e","Type":"ContainerStarted","Data":"b659fb8173875a14c0ca87bf75ea9ff6dc0874785deb1b567383846a3ce6f7ae"} Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.086844 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6854d4949-xljzd" event={"ID":"2d93f914-fdbb-4acc-83f8-30effe510c7e","Type":"ContainerStarted","Data":"302d74c27effed6656aa042c8a5d881de9ce4b133f6cdbb7d4a282f1c893e316"} Mar 14 05:53:13 crc 
kubenswrapper[4713]: I0314 05:53:13.087221 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6854d4949-xljzd" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.155922 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.163166 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.168455 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6854d4949-xljzd" podStartSLOduration=3.168435915 podStartE2EDuration="3.168435915s" podCreationTimestamp="2026-03-14 05:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:13.145952959 +0000 UTC m=+1576.233862279" watchObservedRunningTime="2026-03-14 05:53:13.168435915 +0000 UTC m=+1576.256345215" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.245469 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-749pf\" (UniqueName: \"kubernetes.io/projected/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-kube-api-access-749pf\") pod \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.245577 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-config\") pod \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.245668 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-nb\") pod \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.245844 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-sb\") pod \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.245898 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-swift-storage-0\") pod \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.246041 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-svc\") pod \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\" (UID: \"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a\") " Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.289295 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-kube-api-access-749pf" (OuterVolumeSpecName: "kube-api-access-749pf") pod "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" (UID: "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a"). InnerVolumeSpecName "kube-api-access-749pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.347939 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" (UID: "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.360865 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.360917 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-749pf\" (UniqueName: \"kubernetes.io/projected/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-kube-api-access-749pf\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.368735 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-config" (OuterVolumeSpecName: "config") pod "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" (UID: "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.384691 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" (UID: "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.413622 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" (UID: "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.432708 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" (UID: "af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.462771 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.462813 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.462825 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:13 crc kubenswrapper[4713]: I0314 05:53:13.462833 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:14 crc 
kubenswrapper[4713]: I0314 05:53:14.096808 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerStarted","Data":"d7e456d420f4d54c21e675786f8716139e7ac9049bd419aa4530ed8894df66e6"} Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.097285 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.106878 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerName="cinder-scheduler" containerID="cri-o://906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b" gracePeriod=30 Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.107167 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.108102 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2b9cn" event={"ID":"af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a","Type":"ContainerDied","Data":"b39e465653c1b9d0444dcf6784e9eb52987bf418af5e58a7ca0dad9644b87285"} Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.108132 4713 scope.go:117] "RemoveContainer" containerID="e8df5c89ced32a4a9160e058153b13e74607fbd75974957690a8713fc8f32f04" Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.108924 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerName="probe" containerID="cri-o://f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec" gracePeriod=30 Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.135966 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=3.925196536 podStartE2EDuration="9.135944332s" podCreationTimestamp="2026-03-14 05:53:05 +0000 UTC" firstStartedPulling="2026-03-14 05:53:07.654276474 +0000 UTC m=+1570.742185774" lastFinishedPulling="2026-03-14 05:53:12.86502427 +0000 UTC m=+1575.952933570" observedRunningTime="2026-03-14 05:53:14.12205563 +0000 UTC m=+1577.209964940" watchObservedRunningTime="2026-03-14 05:53:14.135944332 +0000 UTC m=+1577.223853632" Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.153363 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2b9cn"] Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.159479 4713 scope.go:117] "RemoveContainer" containerID="d748fcddbd0e761755e65fcd606bdbe1ea6b82d32b30800d29233576b80e55ca" Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.168024 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2b9cn"] Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.519800 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54566f9956-h7rfh" Mar 14 05:53:14 crc kubenswrapper[4713]: I0314 05:53:14.620244 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54566f9956-h7rfh" Mar 14 05:53:15 crc kubenswrapper[4713]: I0314 05:53:15.580447 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" path="/var/lib/kubelet/pods/af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a/volumes" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.087503 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.157993 4713 generic.go:334] "Generic (PLEG): container finished" podID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerID="f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec" exitCode=0 Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.158024 4713 generic.go:334] "Generic (PLEG): container finished" podID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerID="906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b" exitCode=0 Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.158068 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23c4d5b3-f9fc-4cba-9a5a-133348482f3d","Type":"ContainerDied","Data":"f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec"} Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.158075 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.158094 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23c4d5b3-f9fc-4cba-9a5a-133348482f3d","Type":"ContainerDied","Data":"906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b"} Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.158105 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23c4d5b3-f9fc-4cba-9a5a-133348482f3d","Type":"ContainerDied","Data":"f6bf420b6f12e65cde96ec9a5604b21f9443890c28a8e3addcd0f472e8a02e07"} Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.158119 4713 scope.go:117] "RemoveContainer" containerID="f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.163429 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/2.log" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.164340 4713 generic.go:334] "Generic (PLEG): container finished" podID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerID="a748e2389e0bf3125b924d80cd2d7eba269ec4cff23f1d8a7715009735a60964" exitCode=0 Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.164380 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-569ddb745b-tqpzz" event={"ID":"6e045597-8d40-424a-8982-8dbfb1e379e3","Type":"ContainerDied","Data":"a748e2389e0bf3125b924d80cd2d7eba269ec4cff23f1d8a7715009735a60964"} Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.216549 4713 scope.go:117] "RemoveContainer" containerID="906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.240071 4713 scope.go:117] "RemoveContainer" containerID="f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec" Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 05:53:16.240598 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec\": container with ID starting with f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec not found: ID does not exist" containerID="f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.240636 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec"} err="failed to get container status \"f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec\": rpc error: code = NotFound desc = could not find container \"f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec\": container with ID starting with 
f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec not found: ID does not exist" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.240661 4713 scope.go:117] "RemoveContainer" containerID="906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b" Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 05:53:16.242608 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b\": container with ID starting with 906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b not found: ID does not exist" containerID="906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.242658 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b"} err="failed to get container status \"906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b\": rpc error: code = NotFound desc = could not find container \"906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b\": container with ID starting with 906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b not found: ID does not exist" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.242673 4713 scope.go:117] "RemoveContainer" containerID="f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.242937 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec"} err="failed to get container status \"f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec\": rpc error: code = NotFound desc = could not find container \"f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec\": container with ID 
starting with f6017e1d1a55693011922887fb629647db9250e95e022b303ca30681c18589ec not found: ID does not exist" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.242954 4713 scope.go:117] "RemoveContainer" containerID="906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.244980 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b"} err="failed to get container status \"906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b\": rpc error: code = NotFound desc = could not find container \"906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b\": container with ID starting with 906b84ebb52f8eefb531c008eef0265515ac91d7d2fc8b9306b72f150c2d201b not found: ID does not exist" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.249766 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data\") pod \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.249825 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-combined-ca-bundle\") pod \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.249908 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data-custom\") pod \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 
05:53:16.249991 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-scripts\") pod \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.250259 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-etc-machine-id\") pod \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.250299 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46qp8\" (UniqueName: \"kubernetes.io/projected/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-kube-api-access-46qp8\") pod \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\" (UID: \"23c4d5b3-f9fc-4cba-9a5a-133348482f3d\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.250401 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "23c4d5b3-f9fc-4cba-9a5a-133348482f3d" (UID: "23c4d5b3-f9fc-4cba-9a5a-133348482f3d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.251306 4713 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.261756 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-kube-api-access-46qp8" (OuterVolumeSpecName: "kube-api-access-46qp8") pod "23c4d5b3-f9fc-4cba-9a5a-133348482f3d" (UID: "23c4d5b3-f9fc-4cba-9a5a-133348482f3d"). InnerVolumeSpecName "kube-api-access-46qp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.268083 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "23c4d5b3-f9fc-4cba-9a5a-133348482f3d" (UID: "23c4d5b3-f9fc-4cba-9a5a-133348482f3d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.281335 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-scripts" (OuterVolumeSpecName: "scripts") pod "23c4d5b3-f9fc-4cba-9a5a-133348482f3d" (UID: "23c4d5b3-f9fc-4cba-9a5a-133348482f3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.325432 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23c4d5b3-f9fc-4cba-9a5a-133348482f3d" (UID: "23c4d5b3-f9fc-4cba-9a5a-133348482f3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.353143 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46qp8\" (UniqueName: \"kubernetes.io/projected/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-kube-api-access-46qp8\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.353178 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.353187 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.353197 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.446396 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data" (OuterVolumeSpecName: "config-data") pod "23c4d5b3-f9fc-4cba-9a5a-133348482f3d" (UID: "23c4d5b3-f9fc-4cba-9a5a-133348482f3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.455150 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c4d5b3-f9fc-4cba-9a5a-133348482f3d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.538867 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/2.log" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.539469 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-569ddb745b-tqpzz" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.568546 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.579157 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.642922 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 05:53:16.643577 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-httpd" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.643596 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-httpd" Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 05:53:16.643620 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerName="probe" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.643628 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerName="probe" Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 
05:53:16.643651 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerName="cinder-scheduler" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.643659 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerName="cinder-scheduler" Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 05:53:16.643683 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" containerName="init" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.643692 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" containerName="init" Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 05:53:16.643704 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" containerName="dnsmasq-dns" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.643712 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" containerName="dnsmasq-dns" Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 05:53:16.643724 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-httpd" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.643732 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-httpd" Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 05:53:16.643749 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-httpd" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.643757 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-httpd" Mar 14 05:53:16 crc kubenswrapper[4713]: E0314 05:53:16.643782 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-api" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.643790 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-api" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.644046 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerName="probe" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.644081 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-httpd" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.644099 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" containerName="cinder-scheduler" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.644108 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="af53d9ae-bf57-44b5-b3b4-3f61d3b03a0a" containerName="dnsmasq-dns" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.644120 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-api" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.644131 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-httpd" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.644142 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" containerName="neutron-httpd" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.645856 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.649165 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.656435 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.666997 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkgnz\" (UniqueName: \"kubernetes.io/projected/6e045597-8d40-424a-8982-8dbfb1e379e3-kube-api-access-vkgnz\") pod \"6e045597-8d40-424a-8982-8dbfb1e379e3\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.667094 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-ovndb-tls-certs\") pod \"6e045597-8d40-424a-8982-8dbfb1e379e3\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.667125 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-config\") pod \"6e045597-8d40-424a-8982-8dbfb1e379e3\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.667199 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-httpd-config\") pod \"6e045597-8d40-424a-8982-8dbfb1e379e3\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.667309 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-combined-ca-bundle\") pod \"6e045597-8d40-424a-8982-8dbfb1e379e3\" (UID: \"6e045597-8d40-424a-8982-8dbfb1e379e3\") " Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.677397 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6e045597-8d40-424a-8982-8dbfb1e379e3" (UID: "6e045597-8d40-424a-8982-8dbfb1e379e3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.697766 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e045597-8d40-424a-8982-8dbfb1e379e3-kube-api-access-vkgnz" (OuterVolumeSpecName: "kube-api-access-vkgnz") pod "6e045597-8d40-424a-8982-8dbfb1e379e3" (UID: "6e045597-8d40-424a-8982-8dbfb1e379e3"). InnerVolumeSpecName "kube-api-access-vkgnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.753358 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-config" (OuterVolumeSpecName: "config") pod "6e045597-8d40-424a-8982-8dbfb1e379e3" (UID: "6e045597-8d40-424a-8982-8dbfb1e379e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.773506 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6b61a43-5015-4b52-b55f-4ea941db9a0d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.773562 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.773701 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-scripts\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.773728 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0" Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.773775 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqbdx\" (UniqueName: \"kubernetes.io/projected/f6b61a43-5015-4b52-b55f-4ea941db9a0d-kube-api-access-sqbdx\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0" 
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.773812 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-config-data\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.773911 4713 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.773927 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkgnz\" (UniqueName: \"kubernetes.io/projected/6e045597-8d40-424a-8982-8dbfb1e379e3-kube-api-access-vkgnz\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.773941 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-config\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.781301 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e045597-8d40-424a-8982-8dbfb1e379e3" (UID: "6e045597-8d40-424a-8982-8dbfb1e379e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.788330 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6e045597-8d40-424a-8982-8dbfb1e379e3" (UID: "6e045597-8d40-424a-8982-8dbfb1e379e3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.875609 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-scripts\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.875925 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.876077 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqbdx\" (UniqueName: \"kubernetes.io/projected/f6b61a43-5015-4b52-b55f-4ea941db9a0d-kube-api-access-sqbdx\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.876193 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-config-data\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.876538 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6b61a43-5015-4b52-b55f-4ea941db9a0d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.876652 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6b61a43-5015-4b52-b55f-4ea941db9a0d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.876669 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.876968 4713 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.877065 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e045597-8d40-424a-8982-8dbfb1e379e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.880535 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-scripts\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.882021 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.882338 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-config-data\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.884049 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b61a43-5015-4b52-b55f-4ea941db9a0d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:16 crc kubenswrapper[4713]: I0314 05:53:16.897482 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqbdx\" (UniqueName: \"kubernetes.io/projected/f6b61a43-5015-4b52-b55f-4ea941db9a0d-kube-api-access-sqbdx\") pod \"cinder-scheduler-0\" (UID: \"f6b61a43-5015-4b52-b55f-4ea941db9a0d\") " pod="openstack/cinder-scheduler-0"
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.036354 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.193169 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-569ddb745b-tqpzz_6e045597-8d40-424a-8982-8dbfb1e379e3/neutron-httpd/2.log"
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.193887 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-569ddb745b-tqpzz"
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.193954 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-569ddb745b-tqpzz" event={"ID":"6e045597-8d40-424a-8982-8dbfb1e379e3","Type":"ContainerDied","Data":"6e632b3ab7066094fd3339354e722a805d0c43cbb8db9d548dcf497f1eaed56c"}
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.194024 4713 scope.go:117] "RemoveContainer" containerID="203b7b9f2b4ab6267edb7d43d937c26933bb7ac10a4ddcd50556ce87ac84fbc4"
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.234558 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-569ddb745b-tqpzz"]
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.241388 4713 scope.go:117] "RemoveContainer" containerID="a748e2389e0bf3125b924d80cd2d7eba269ec4cff23f1d8a7715009735a60964"
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.245084 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-569ddb745b-tqpzz"]
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.575455 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23c4d5b3-f9fc-4cba-9a5a-133348482f3d" path="/var/lib/kubelet/pods/23c4d5b3-f9fc-4cba-9a5a-133348482f3d/volumes"
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.576414 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e045597-8d40-424a-8982-8dbfb1e379e3" path="/var/lib/kubelet/pods/6e045597-8d40-424a-8982-8dbfb1e379e3/volumes"
Mar 14 05:53:17 crc kubenswrapper[4713]: I0314 05:53:17.586412 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 14 05:53:18 crc kubenswrapper[4713]: I0314 05:53:18.214300 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b61a43-5015-4b52-b55f-4ea941db9a0d","Type":"ContainerStarted","Data":"bc71ec22ace11517a769c37910d01fbfff4e80576f897e1d6b3e499ebdffbc10"}
Mar 14 05:53:19 crc kubenswrapper[4713]: I0314 05:53:19.293136 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b61a43-5015-4b52-b55f-4ea941db9a0d","Type":"ContainerStarted","Data":"549819fc0aca7b042e506cb219f30aa31ecc5117c5843e20676e34ee6700e42e"}
Mar 14 05:53:19 crc kubenswrapper[4713]: I0314 05:53:19.596571 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b9cc97768-hg4ff"
Mar 14 05:53:19 crc kubenswrapper[4713]: I0314 05:53:19.673902 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b9cc97768-hg4ff"
Mar 14 05:53:19 crc kubenswrapper[4713]: I0314 05:53:19.907227 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-59cfcdd844-lx8mr"]
Mar 14 05:53:19 crc kubenswrapper[4713]: I0314 05:53:19.909458 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:19 crc kubenswrapper[4713]: I0314 05:53:19.928593 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59cfcdd844-lx8mr"]
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.060388 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-scripts\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.060534 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-internal-tls-certs\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.060574 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-combined-ca-bundle\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.060604 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51065c42-7604-4da1-8119-395c8c1ace81-logs\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.060664 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-public-tls-certs\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.060805 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-config-data\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.060835 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dc65\" (UniqueName: \"kubernetes.io/projected/51065c42-7604-4da1-8119-395c8c1ace81-kube-api-access-4dc65\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.163291 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51065c42-7604-4da1-8119-395c8c1ace81-logs\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.163368 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-public-tls-certs\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.163470 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-config-data\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.163491 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dc65\" (UniqueName: \"kubernetes.io/projected/51065c42-7604-4da1-8119-395c8c1ace81-kube-api-access-4dc65\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.163587 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-scripts\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.163639 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-internal-tls-certs\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.163661 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-combined-ca-bundle\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.164689 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51065c42-7604-4da1-8119-395c8c1ace81-logs\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.170631 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-public-tls-certs\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.170678 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-internal-tls-certs\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.171504 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-scripts\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.174036 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-combined-ca-bundle\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.174044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51065c42-7604-4da1-8119-395c8c1ace81-config-data\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.202147 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dc65\" (UniqueName: \"kubernetes.io/projected/51065c42-7604-4da1-8119-395c8c1ace81-kube-api-access-4dc65\") pod \"placement-59cfcdd844-lx8mr\" (UID: \"51065c42-7604-4da1-8119-395c8c1ace81\") " pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.231789 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.312456 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b61a43-5015-4b52-b55f-4ea941db9a0d","Type":"ContainerStarted","Data":"69a62892e0c5dab927214c55f7397b2b5c2a0be4e7af82de276bea22ae1f2121"}
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.357152 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.357134355 podStartE2EDuration="4.357134355s" podCreationTimestamp="2026-03-14 05:53:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:20.348566022 +0000 UTC m=+1583.436475322" watchObservedRunningTime="2026-03-14 05:53:20.357134355 +0000 UTC m=+1583.445043655"
Mar 14 05:53:20 crc kubenswrapper[4713]: I0314 05:53:20.936630 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59cfcdd844-lx8mr"]
Mar 14 05:53:21 crc kubenswrapper[4713]: I0314 05:53:21.321054 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59cfcdd844-lx8mr" event={"ID":"51065c42-7604-4da1-8119-395c8c1ace81","Type":"ContainerStarted","Data":"2c6ec9b8c4d122ea50c3dac7e0a48216e0be6a400dca1f758f664556be543166"}
Mar 14 05:53:21 crc kubenswrapper[4713]: I0314 05:53:21.812083 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 14 05:53:22 crc kubenswrapper[4713]: I0314 05:53:22.036565 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 14 05:53:22 crc kubenswrapper[4713]: I0314 05:53:22.338058 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59cfcdd844-lx8mr" event={"ID":"51065c42-7604-4da1-8119-395c8c1ace81","Type":"ContainerStarted","Data":"405fce8e874ac2378cbfdc5c471262b9b3959a69844e1001d65d09415fd939a2"}
Mar 14 05:53:22 crc kubenswrapper[4713]: I0314 05:53:22.338105 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59cfcdd844-lx8mr" event={"ID":"51065c42-7604-4da1-8119-395c8c1ace81","Type":"ContainerStarted","Data":"fcf6f07eb3bb13ce3040d9cefeb1f157b2ec1137cf641f4ab6744bc3381bcdd3"}
Mar 14 05:53:22 crc kubenswrapper[4713]: I0314 05:53:22.339582 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:22 crc kubenswrapper[4713]: I0314 05:53:22.339735 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59cfcdd844-lx8mr"
Mar 14 05:53:22 crc kubenswrapper[4713]: I0314 05:53:22.389994 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-59cfcdd844-lx8mr" podStartSLOduration=3.389960066 podStartE2EDuration="3.389960066s" podCreationTimestamp="2026-03-14 05:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:22.371741565 +0000 UTC m=+1585.459650865" watchObservedRunningTime="2026-03-14 05:53:22.389960066 +0000 UTC m=+1585.477869366"
Mar 14 05:53:23 crc kubenswrapper[4713]: I0314 05:53:23.251929 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66dc7cf97d-ll6gt"
Mar 14 05:53:23 crc kubenswrapper[4713]: I0314 05:53:23.386535 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-69d88696fb-hfdtr"
Mar 14 05:53:23 crc kubenswrapper[4713]: I0314 05:53:23.455454 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66dc7cf97d-ll6gt" podUID="c0226c41-0d23-4ea8-b8ff-0f1b20a04f68" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.212:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 05:53:23 crc kubenswrapper[4713]: I0314 05:53:23.499239 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66dc7cf97d-ll6gt"
Mar 14 05:53:23 crc kubenswrapper[4713]: I0314 05:53:23.606251 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54566f9956-h7rfh"]
Mar 14 05:53:23 crc kubenswrapper[4713]: I0314 05:53:23.606513 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54566f9956-h7rfh" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api-log" containerID="cri-o://41b1db5ad83e7c6047fa94535ac12eb197fad8feae25f04d65b343f000c6a301" gracePeriod=30
Mar 14 05:53:23 crc kubenswrapper[4713]: I0314 05:53:23.607084 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54566f9956-h7rfh" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api" containerID="cri-o://9c2e9acef78bdf0d82a2dca07b76f499e6bcb16689c3674ce8a2cfa79e361adc" gracePeriod=30
Mar 14 05:53:24 crc kubenswrapper[4713]: I0314 05:53:24.360248 4713 generic.go:334] "Generic (PLEG): container finished" podID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerID="41b1db5ad83e7c6047fa94535ac12eb197fad8feae25f04d65b343f000c6a301" exitCode=143
Mar 14 05:53:24 crc kubenswrapper[4713]: I0314 05:53:24.360331 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54566f9956-h7rfh" event={"ID":"80a9d55e-79c2-4a43-af43-4c9213e93501","Type":"ContainerDied","Data":"41b1db5ad83e7c6047fa94535ac12eb197fad8feae25f04d65b343f000c6a301"}
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.060742 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.064086 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.075008 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.075187 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-tzmfd"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.075264 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.087459 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.140033 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc9qq\" (UniqueName: \"kubernetes.io/projected/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-kube-api-access-xc9qq\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.140086 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config-secret\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.140264 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.140410 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.241664 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.241794 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.241947 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc9qq\" (UniqueName: \"kubernetes.io/projected/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-kube-api-access-xc9qq\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.241975 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config-secret\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.256347 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.256799 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config-secret\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.256816 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.259743 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc9qq\" (UniqueName: \"kubernetes.io/projected/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-kube-api-access-xc9qq\") pod \"openstackclient\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.365323 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.367620 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.411092 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.438701 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.441458 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.460614 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.469335 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/869960ea-c2fe-4a61-8f70-2e7724af6426-openstack-config\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.469428 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/869960ea-c2fe-4a61-8f70-2e7724af6426-openstack-config-secret\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.469458 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869960ea-c2fe-4a61-8f70-2e7724af6426-combined-ca-bundle\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.469633 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4v8h\" (UniqueName: \"kubernetes.io/projected/869960ea-c2fe-4a61-8f70-2e7724af6426-kube-api-access-d4v8h\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.572474 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/869960ea-c2fe-4a61-8f70-2e7724af6426-openstack-config\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.572551 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/869960ea-c2fe-4a61-8f70-2e7724af6426-openstack-config-secret\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.572581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869960ea-c2fe-4a61-8f70-2e7724af6426-combined-ca-bundle\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.572737 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4v8h\" (UniqueName: \"kubernetes.io/projected/869960ea-c2fe-4a61-8f70-2e7724af6426-kube-api-access-d4v8h\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.574515 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/869960ea-c2fe-4a61-8f70-2e7724af6426-openstack-config\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.588461 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869960ea-c2fe-4a61-8f70-2e7724af6426-combined-ca-bundle\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.600491 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/869960ea-c2fe-4a61-8f70-2e7724af6426-openstack-config-secret\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.601859 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4v8h\" (UniqueName: \"kubernetes.io/projected/869960ea-c2fe-4a61-8f70-2e7724af6426-kube-api-access-d4v8h\") pod \"openstackclient\" (UID: \"869960ea-c2fe-4a61-8f70-2e7724af6426\") " pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: E0314 05:53:25.694824 4713 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 14 05:53:25 crc kubenswrapper[4713]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_8de83d6c-eaf4-42a2-b607-c4e0dbecce90_0(05d526c644b4df8ca89942265967965e309595f6deebe77494b03907842d1343): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"05d526c644b4df8ca89942265967965e309595f6deebe77494b03907842d1343" Netns:"/var/run/netns/36f4db7d-f0ad-41c2-bc7b-0dc32f614167" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=05d526c644b4df8ca89942265967965e309595f6deebe77494b03907842d1343;K8S_POD_UID=8de83d6c-eaf4-42a2-b607-c4e0dbecce90" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/8de83d6c-eaf4-42a2-b607-c4e0dbecce90]: expected pod UID "8de83d6c-eaf4-42a2-b607-c4e0dbecce90" but got "869960ea-c2fe-4a61-8f70-2e7724af6426" from Kube API
Mar 14 05:53:25 crc kubenswrapper[4713]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 14 05:53:25 crc kubenswrapper[4713]: >
Mar 14 05:53:25 crc kubenswrapper[4713]: E0314 05:53:25.695411 4713 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 14 05:53:25 crc kubenswrapper[4713]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_8de83d6c-eaf4-42a2-b607-c4e0dbecce90_0(05d526c644b4df8ca89942265967965e309595f6deebe77494b03907842d1343): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"05d526c644b4df8ca89942265967965e309595f6deebe77494b03907842d1343" Netns:"/var/run/netns/36f4db7d-f0ad-41c2-bc7b-0dc32f614167" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=05d526c644b4df8ca89942265967965e309595f6deebe77494b03907842d1343;K8S_POD_UID=8de83d6c-eaf4-42a2-b607-c4e0dbecce90" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/8de83d6c-eaf4-42a2-b607-c4e0dbecce90]: expected pod UID "8de83d6c-eaf4-42a2-b607-c4e0dbecce90" but got "869960ea-c2fe-4a61-8f70-2e7724af6426" from Kube API
Mar 14 05:53:25 crc kubenswrapper[4713]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 14 05:53:25 crc kubenswrapper[4713]: > pod="openstack/openstackclient"
Mar 14 05:53:25 crc kubenswrapper[4713]: I0314 05:53:25.840864 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 14 05:53:26 crc kubenswrapper[4713]: I0314 05:53:26.383802 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 14 05:53:26 crc kubenswrapper[4713]: I0314 05:53:26.394879 4713 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/openstackclient" Mar 14 05:53:26 crc kubenswrapper[4713]: I0314 05:53:26.402360 4713 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8de83d6c-eaf4-42a2-b607-c4e0dbecce90" podUID="869960ea-c2fe-4a61-8f70-2e7724af6426" Mar 14 05:53:26 crc kubenswrapper[4713]: I0314 05:53:26.423937 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.494845 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-combined-ca-bundle\") pod \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.494990 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config-secret\") pod \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.495059 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config\") pod \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.495084 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc9qq\" (UniqueName: \"kubernetes.io/projected/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-kube-api-access-xc9qq\") pod \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\" (UID: \"8de83d6c-eaf4-42a2-b607-c4e0dbecce90\") " Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 
05:53:26.496057 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8de83d6c-eaf4-42a2-b607-c4e0dbecce90" (UID: "8de83d6c-eaf4-42a2-b607-c4e0dbecce90"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.502354 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8de83d6c-eaf4-42a2-b607-c4e0dbecce90" (UID: "8de83d6c-eaf4-42a2-b607-c4e0dbecce90"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.508496 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-kube-api-access-xc9qq" (OuterVolumeSpecName: "kube-api-access-xc9qq") pod "8de83d6c-eaf4-42a2-b607-c4e0dbecce90" (UID: "8de83d6c-eaf4-42a2-b607-c4e0dbecce90"). InnerVolumeSpecName "kube-api-access-xc9qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.514375 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8de83d6c-eaf4-42a2-b607-c4e0dbecce90" (UID: "8de83d6c-eaf4-42a2-b607-c4e0dbecce90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.597572 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.597601 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.597610 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc9qq\" (UniqueName: \"kubernetes.io/projected/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-kube-api-access-xc9qq\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.597618 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de83d6c-eaf4-42a2-b607-c4e0dbecce90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.840481 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54566f9956-h7rfh" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.207:9311/healthcheck\": read tcp 10.217.0.2:40192->10.217.0.207:9311: read: connection reset by peer" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:26.842930 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54566f9956-h7rfh" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.207:9311/healthcheck\": read tcp 10.217.0.2:40186->10.217.0.207:9311: read: connection reset by peer" Mar 14 05:53:27 crc kubenswrapper[4713]: 
I0314 05:53:27.347023 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.447054 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"869960ea-c2fe-4a61-8f70-2e7724af6426","Type":"ContainerStarted","Data":"c1d046477e7bb35a90817c964c0a0900b311c5ab72df64386fc068f8f29cee02"} Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.458440 4713 generic.go:334] "Generic (PLEG): container finished" podID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerID="9c2e9acef78bdf0d82a2dca07b76f499e6bcb16689c3674ce8a2cfa79e361adc" exitCode=0 Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.458524 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.458611 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54566f9956-h7rfh" event={"ID":"80a9d55e-79c2-4a43-af43-4c9213e93501","Type":"ContainerDied","Data":"9c2e9acef78bdf0d82a2dca07b76f499e6bcb16689c3674ce8a2cfa79e361adc"} Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.582339 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de83d6c-eaf4-42a2-b607-c4e0dbecce90" path="/var/lib/kubelet/pods/8de83d6c-eaf4-42a2-b607-c4e0dbecce90/volumes" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.612435 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54566f9956-h7rfh" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.733019 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data\") pod \"80a9d55e-79c2-4a43-af43-4c9213e93501\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.733180 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7vqq\" (UniqueName: \"kubernetes.io/projected/80a9d55e-79c2-4a43-af43-4c9213e93501-kube-api-access-n7vqq\") pod \"80a9d55e-79c2-4a43-af43-4c9213e93501\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.733261 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80a9d55e-79c2-4a43-af43-4c9213e93501-logs\") pod \"80a9d55e-79c2-4a43-af43-4c9213e93501\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.733403 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-combined-ca-bundle\") pod \"80a9d55e-79c2-4a43-af43-4c9213e93501\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.733429 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data-custom\") pod \"80a9d55e-79c2-4a43-af43-4c9213e93501\" (UID: \"80a9d55e-79c2-4a43-af43-4c9213e93501\") " Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.734555 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/80a9d55e-79c2-4a43-af43-4c9213e93501-logs" (OuterVolumeSpecName: "logs") pod "80a9d55e-79c2-4a43-af43-4c9213e93501" (UID: "80a9d55e-79c2-4a43-af43-4c9213e93501"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.742344 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "80a9d55e-79c2-4a43-af43-4c9213e93501" (UID: "80a9d55e-79c2-4a43-af43-4c9213e93501"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.750584 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a9d55e-79c2-4a43-af43-4c9213e93501-kube-api-access-n7vqq" (OuterVolumeSpecName: "kube-api-access-n7vqq") pod "80a9d55e-79c2-4a43-af43-4c9213e93501" (UID: "80a9d55e-79c2-4a43-af43-4c9213e93501"). InnerVolumeSpecName "kube-api-access-n7vqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.782057 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80a9d55e-79c2-4a43-af43-4c9213e93501" (UID: "80a9d55e-79c2-4a43-af43-4c9213e93501"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.843544 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.843764 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.843776 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7vqq\" (UniqueName: \"kubernetes.io/projected/80a9d55e-79c2-4a43-af43-4c9213e93501-kube-api-access-n7vqq\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.843790 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80a9d55e-79c2-4a43-af43-4c9213e93501-logs\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.874361 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data" (OuterVolumeSpecName: "config-data") pod "80a9d55e-79c2-4a43-af43-4c9213e93501" (UID: "80a9d55e-79c2-4a43-af43-4c9213e93501"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:27 crc kubenswrapper[4713]: I0314 05:53:27.945496 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d55e-79c2-4a43-af43-4c9213e93501-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:28 crc kubenswrapper[4713]: I0314 05:53:28.477799 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54566f9956-h7rfh" event={"ID":"80a9d55e-79c2-4a43-af43-4c9213e93501","Type":"ContainerDied","Data":"2ebf301543321a9b6245b87a71b9695f25587242c9534fa50385d5a696c72972"} Mar 14 05:53:28 crc kubenswrapper[4713]: I0314 05:53:28.477873 4713 scope.go:117] "RemoveContainer" containerID="9c2e9acef78bdf0d82a2dca07b76f499e6bcb16689c3674ce8a2cfa79e361adc" Mar 14 05:53:28 crc kubenswrapper[4713]: I0314 05:53:28.478066 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54566f9956-h7rfh" Mar 14 05:53:28 crc kubenswrapper[4713]: I0314 05:53:28.550255 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54566f9956-h7rfh"] Mar 14 05:53:28 crc kubenswrapper[4713]: I0314 05:53:28.571223 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-54566f9956-h7rfh"] Mar 14 05:53:28 crc kubenswrapper[4713]: I0314 05:53:28.572363 4713 scope.go:117] "RemoveContainer" containerID="41b1db5ad83e7c6047fa94535ac12eb197fad8feae25f04d65b343f000c6a301" Mar 14 05:53:29 crc kubenswrapper[4713]: I0314 05:53:29.577057 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" path="/var/lib/kubelet/pods/80a9d55e-79c2-4a43-af43-4c9213e93501/volumes" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.123224 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-75d59d8dc5-c57fx"] Mar 14 05:53:31 crc kubenswrapper[4713]: E0314 05:53:31.125075 4713 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api-log" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.125197 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api-log" Mar 14 05:53:31 crc kubenswrapper[4713]: E0314 05:53:31.125297 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.125389 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.125772 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api-log" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.125871 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a9d55e-79c2-4a43-af43-4c9213e93501" containerName="barbican-api" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.127440 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.132786 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.133034 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.133226 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.169288 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75d59d8dc5-c57fx"] Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.231628 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j565c\" (UniqueName: \"kubernetes.io/projected/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-kube-api-access-j565c\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.232386 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-run-httpd\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.232484 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-internal-tls-certs\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc 
kubenswrapper[4713]: I0314 05:53:31.232535 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-public-tls-certs\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.232671 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-etc-swift\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.232729 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-log-httpd\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.232897 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-config-data\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.232959 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-combined-ca-bundle\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" 
Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.335078 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-etc-swift\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.335126 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-log-httpd\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.335178 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-config-data\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.335227 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-combined-ca-bundle\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.335319 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j565c\" (UniqueName: \"kubernetes.io/projected/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-kube-api-access-j565c\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.335404 
4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-run-httpd\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.335435 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-internal-tls-certs\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.335461 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-public-tls-certs\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.335759 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-log-httpd\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.336066 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-run-httpd\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.343315 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-combined-ca-bundle\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.343440 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-public-tls-certs\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.343552 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-etc-swift\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.343455 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-config-data\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.345233 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-internal-tls-certs\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx" Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.353447 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j565c\" (UniqueName: 
\"kubernetes.io/projected/c945cc41-0bca-48e1-97b9-d0fb8085e3ca-kube-api-access-j565c\") pod \"swift-proxy-75d59d8dc5-c57fx\" (UID: \"c945cc41-0bca-48e1-97b9-d0fb8085e3ca\") " pod="openstack/swift-proxy-75d59d8dc5-c57fx"
Mar 14 05:53:31 crc kubenswrapper[4713]: I0314 05:53:31.467767 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75d59d8dc5-c57fx"
Mar 14 05:53:32 crc kubenswrapper[4713]: I0314 05:53:32.169456 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75d59d8dc5-c57fx"]
Mar 14 05:53:32 crc kubenswrapper[4713]: I0314 05:53:32.545295 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75d59d8dc5-c57fx" event={"ID":"c945cc41-0bca-48e1-97b9-d0fb8085e3ca","Type":"ContainerStarted","Data":"ed7b2bc98e1afe1e71bed85e1e3671f2866c651db1d7ac20825a68cb2d76a957"}
Mar 14 05:53:33 crc kubenswrapper[4713]: I0314 05:53:33.558521 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75d59d8dc5-c57fx" event={"ID":"c945cc41-0bca-48e1-97b9-d0fb8085e3ca","Type":"ContainerStarted","Data":"ab6344f451e684e32b1cd0292c8d621ea8dd12c9510fe11bc1ff7eb5017238a3"}
Mar 14 05:53:33 crc kubenswrapper[4713]: I0314 05:53:33.559007 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75d59d8dc5-c57fx" event={"ID":"c945cc41-0bca-48e1-97b9-d0fb8085e3ca","Type":"ContainerStarted","Data":"86c3372c273d7c2c1fe269ef60a3a01e0559ea9a971245a326c67919f7654257"}
Mar 14 05:53:33 crc kubenswrapper[4713]: I0314 05:53:33.559057 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75d59d8dc5-c57fx"
Mar 14 05:53:33 crc kubenswrapper[4713]: I0314 05:53:33.559079 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75d59d8dc5-c57fx"
Mar 14 05:53:33 crc kubenswrapper[4713]: I0314 05:53:33.597612 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-75d59d8dc5-c57fx" podStartSLOduration=2.597590759 podStartE2EDuration="2.597590759s" podCreationTimestamp="2026-03-14 05:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:33.593084424 +0000 UTC m=+1596.680993744" watchObservedRunningTime="2026-03-14 05:53:33.597590759 +0000 UTC m=+1596.685500059"
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.198308 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.198548 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="ceilometer-central-agent" containerID="cri-o://c75a8381a926238c7b1e25e6c5959462a6e3ad9dc577286dd72727e414dae81d" gracePeriod=30
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.199357 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="ceilometer-notification-agent" containerID="cri-o://3e575b54cb1b5df96bdf0ab7ee4be403c8cb34f91c750616e5a05e623cb6f72f" gracePeriod=30
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.199363 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="sg-core" containerID="cri-o://44da8a65554c739ca9899effe32764b7ae4be9a685379423a55bb6937395ec4b" gracePeriod=30
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.199526 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="proxy-httpd" containerID="cri-o://d7e456d420f4d54c21e675786f8716139e7ac9049bd419aa4530ed8894df66e6" gracePeriod=30
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.232807 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.211:3000/\": EOF"
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.436110 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.443051 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-log" containerID="cri-o://57cee5ab4a51db1af616270bd1f42f4369a82ee3d254e06ea9e8c414b14e8509" gracePeriod=30
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.443144 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-httpd" containerID="cri-o://fe9f1c4270f728c656634153b98deef2c3986d1086d3ed248f07da3063174eb1" gracePeriod=30
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.576343 4713 generic.go:334] "Generic (PLEG): container finished" podID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerID="d7e456d420f4d54c21e675786f8716139e7ac9049bd419aa4530ed8894df66e6" exitCode=0
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.576387 4713 generic.go:334] "Generic (PLEG): container finished" podID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerID="44da8a65554c739ca9899effe32764b7ae4be9a685379423a55bb6937395ec4b" exitCode=2
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.577620 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerDied","Data":"d7e456d420f4d54c21e675786f8716139e7ac9049bd419aa4530ed8894df66e6"}
Mar 14 05:53:34 crc kubenswrapper[4713]: I0314 05:53:34.577658 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerDied","Data":"44da8a65554c739ca9899effe32764b7ae4be9a685379423a55bb6937395ec4b"}
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.291454 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-599cd54c4b-t7gdc"]
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.293299 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.298845 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.299141 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8tk8h"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.299366 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.312725 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-599cd54c4b-t7gdc"]
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.333631 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-combined-ca-bundle\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.333693 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data-custom\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.333755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.334083 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsctc\" (UniqueName: \"kubernetes.io/projected/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-kube-api-access-hsctc\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.443799 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-5cxjw"]
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.464051 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.480124 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsctc\" (UniqueName: \"kubernetes.io/projected/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-kube-api-access-hsctc\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.481560 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-combined-ca-bundle\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.481736 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data-custom\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.481905 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.502676 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-5cxjw"]
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.506032 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-combined-ca-bundle\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.510719 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data-custom\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.511113 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.529395 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsctc\" (UniqueName: \"kubernetes.io/projected/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-kube-api-access-hsctc\") pod \"heat-engine-599cd54c4b-t7gdc\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.639461 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-684cc7695b-tnj9p"]
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.641224 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6cffcd59cb-mklc8"]
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.641804 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.648404 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.649183 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.651267 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-599cd54c4b-t7gdc"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.654648 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.688974 4713 generic.go:334] "Generic (PLEG): container finished" podID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerID="57cee5ab4a51db1af616270bd1f42f4369a82ee3d254e06ea9e8c414b14e8509" exitCode=143
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.689064 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b733ba7d-6fd3-430d-83ee-3d9f32bad251","Type":"ContainerDied","Data":"57cee5ab4a51db1af616270bd1f42f4369a82ee3d254e06ea9e8c414b14e8509"}
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.690539 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.690817 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.690893 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.690988 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-config\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.691068 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmt9\" (UniqueName: \"kubernetes.io/projected/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-kube-api-access-gxmt9\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.691298 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.693812 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-684cc7695b-tnj9p"]
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.717161 4713 generic.go:334] "Generic (PLEG): container finished" podID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerID="3e575b54cb1b5df96bdf0ab7ee4be403c8cb34f91c750616e5a05e623cb6f72f" exitCode=0
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.717517 4713 generic.go:334] "Generic (PLEG): container finished" podID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerID="c75a8381a926238c7b1e25e6c5959462a6e3ad9dc577286dd72727e414dae81d" exitCode=0
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.717367 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerDied","Data":"3e575b54cb1b5df96bdf0ab7ee4be403c8cb34f91c750616e5a05e623cb6f72f"}
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.717567 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerDied","Data":"c75a8381a926238c7b1e25e6c5959462a6e3ad9dc577286dd72727e414dae81d"}
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.728474 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6cffcd59cb-mklc8"]
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.794884 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data-custom\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.794997 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt6fn\" (UniqueName: \"kubernetes.io/projected/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-kube-api-access-dt6fn\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795037 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-combined-ca-bundle\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795134 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chbwh\" (UniqueName: \"kubernetes.io/projected/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-kube-api-access-chbwh\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795178 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795231 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795403 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data-custom\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795487 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795531 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795666 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795756 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795811 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-combined-ca-bundle\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795913 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-config\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.795958 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmt9\" (UniqueName: \"kubernetes.io/projected/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-kube-api-access-gxmt9\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.796078 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.796330 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.796916 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.797592 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.797837 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-config\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.815820 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmt9\" (UniqueName: \"kubernetes.io/projected/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-kube-api-access-gxmt9\") pod \"dnsmasq-dns-688b9f5b49-5cxjw\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.897770 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data-custom\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.897842 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.897933 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-combined-ca-bundle\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.898015 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data-custom\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.898086 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt6fn\" (UniqueName: \"kubernetes.io/projected/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-kube-api-access-dt6fn\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.898127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-combined-ca-bundle\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.898233 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chbwh\" (UniqueName: \"kubernetes.io/projected/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-kube-api-access-chbwh\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.898289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.900662 4713 scope.go:117] "RemoveContainer" containerID="05f9987dc802058b1c3afcda5115b37ce74c083a3ec3bed5fcb50c7473f8a311"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.903027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.911031 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.912029 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.919076 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-combined-ca-bundle\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.925275 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data-custom\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.942331 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-combined-ca-bundle\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.943052 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt6fn\" (UniqueName: \"kubernetes.io/projected/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-kube-api-access-dt6fn\") pod \"heat-cfnapi-684cc7695b-tnj9p\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.943329 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data-custom\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.960127 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chbwh\" (UniqueName: \"kubernetes.io/projected/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-kube-api-access-chbwh\") pod \"heat-api-6cffcd59cb-mklc8\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:35 crc kubenswrapper[4713]: I0314 05:53:35.998783 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-684cc7695b-tnj9p"
Mar 14 05:53:36 crc kubenswrapper[4713]: I0314 05:53:36.016616 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6cffcd59cb-mklc8"
Mar 14 05:53:36 crc kubenswrapper[4713]: I0314 05:53:36.353681 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.211:3000/\": dial tcp 10.217.0.211:3000: connect: connection refused"
Mar 14 05:53:37 crc kubenswrapper[4713]: I0314 05:53:37.668522 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.196:9292/healthcheck\": dial tcp 10.217.0.196:9292: connect: connection refused"
Mar 14 05:53:37 crc kubenswrapper[4713]: I0314 05:53:37.669101 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.196:9292/healthcheck\": dial tcp 10.217.0.196:9292: connect: connection refused"
Mar 14 05:53:37 crc kubenswrapper[4713]: I0314 05:53:37.677445 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 14 05:53:37 crc kubenswrapper[4713]: I0314 05:53:37.677713 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerName="glance-log" containerID="cri-o://9d1a343a53ec0ab52b00862fd9c26a72290f2ed8da4865f4e2b714d30b83a19e" gracePeriod=30
Mar 14 05:53:37 crc kubenswrapper[4713]: I0314 05:53:37.677847 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerName="glance-httpd" containerID="cri-o://c9698ecce9eddc0bcd717f7b57cf34fdd3e3de66254ab15ddef07410622a4221" gracePeriod=30
Mar 14 05:53:37 crc kubenswrapper[4713]: I0314 05:53:37.752439 4713 generic.go:334] "Generic (PLEG): container finished" podID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerID="fe9f1c4270f728c656634153b98deef2c3986d1086d3ed248f07da3063174eb1" exitCode=0
Mar 14 05:53:37 crc kubenswrapper[4713]: I0314 05:53:37.752490 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b733ba7d-6fd3-430d-83ee-3d9f32bad251","Type":"ContainerDied","Data":"fe9f1c4270f728c656634153b98deef2c3986d1086d3ed248f07da3063174eb1"}
Mar 14 05:53:38 crc kubenswrapper[4713]: I0314 05:53:38.775717 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerID="9d1a343a53ec0ab52b00862fd9c26a72290f2ed8da4865f4e2b714d30b83a19e" exitCode=143
Mar 14 05:53:38 crc kubenswrapper[4713]: I0314 05:53:38.776002 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ae5595c-d4de-4db7-b410-d149afd0f6a1","Type":"ContainerDied","Data":"9d1a343a53ec0ab52b00862fd9c26a72290f2ed8da4865f4e2b714d30b83a19e"}
Mar 14 05:53:39 crc kubenswrapper[4713]: I0314 05:53:39.795396 4713 generic.go:334] "Generic (PLEG): container finished" podID="64bc4849-85d3-4043-bfae-18176e47b753" containerID="994ee36d06ca632ff30863cbd9a2aa7b624d5f4c65a4267928cdab75b1474f56" exitCode=137
Mar 14 05:53:39 crc kubenswrapper[4713]: I0314 05:53:39.795473 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64bc4849-85d3-4043-bfae-18176e47b753","Type":"ContainerDied","Data":"994ee36d06ca632ff30863cbd9a2aa7b624d5f4c65a4267928cdab75b1474f56"}
Mar 14 05:53:39 crc kubenswrapper[4713]: I0314 05:53:39.978159 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vgcgm"]
Mar 14 05:53:39 crc kubenswrapper[4713]: I0314 05:53:39.981522 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vgcgm"
Mar 14 05:53:39 crc kubenswrapper[4713]: I0314 05:53:39.998644 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vgcgm"]
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.094793 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5n98m"]
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.096537 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5n98m"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.099940 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zvm\" (UniqueName: \"kubernetes.io/projected/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-kube-api-access-n5zvm\") pod \"nova-api-db-create-vgcgm\" (UID: \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\") " pod="openstack/nova-api-db-create-vgcgm"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.100060 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-operator-scripts\") pod \"nova-api-db-create-vgcgm\" (UID: \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\") " pod="openstack/nova-api-db-create-vgcgm"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.116053 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5n98m"]
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.203391 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bfr4\" (UniqueName: \"kubernetes.io/projected/bec0abd6-d181-493f-a285-932a17fac41d-kube-api-access-4bfr4\") pod \"nova-cell0-db-create-5n98m\" (UID: \"bec0abd6-d181-493f-a285-932a17fac41d\") " pod="openstack/nova-cell0-db-create-5n98m"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.203546 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zvm\" (UniqueName: \"kubernetes.io/projected/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-kube-api-access-n5zvm\") pod \"nova-api-db-create-vgcgm\" (UID: \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\") " pod="openstack/nova-api-db-create-vgcgm"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.203575 4713 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bec0abd6-d181-493f-a285-932a17fac41d-operator-scripts\") pod \"nova-cell0-db-create-5n98m\" (UID: \"bec0abd6-d181-493f-a285-932a17fac41d\") " pod="openstack/nova-cell0-db-create-5n98m" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.203624 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-operator-scripts\") pod \"nova-api-db-create-vgcgm\" (UID: \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\") " pod="openstack/nova-api-db-create-vgcgm" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.204411 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-operator-scripts\") pod \"nova-api-db-create-vgcgm\" (UID: \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\") " pod="openstack/nova-api-db-create-vgcgm" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.205830 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-782d-account-create-update-pglqr"] Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.207527 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.213136 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.233190 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-782d-account-create-update-pglqr"] Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.283028 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zvm\" (UniqueName: \"kubernetes.io/projected/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-kube-api-access-n5zvm\") pod \"nova-api-db-create-vgcgm\" (UID: \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\") " pod="openstack/nova-api-db-create-vgcgm" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.316579 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-d28jb"] Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.318897 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d28jb" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.322911 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vgcgm" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.353969 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bec0abd6-d181-493f-a285-932a17fac41d-operator-scripts\") pod \"nova-cell0-db-create-5n98m\" (UID: \"bec0abd6-d181-493f-a285-932a17fac41d\") " pod="openstack/nova-cell0-db-create-5n98m" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.373491 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa23366-9513-4a8e-af1f-1b6b7596a5ac-operator-scripts\") pod \"nova-api-782d-account-create-update-pglqr\" (UID: \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\") " pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.355173 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bec0abd6-d181-493f-a285-932a17fac41d-operator-scripts\") pod \"nova-cell0-db-create-5n98m\" (UID: \"bec0abd6-d181-493f-a285-932a17fac41d\") " pod="openstack/nova-cell0-db-create-5n98m" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.361093 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d28jb"] Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.373976 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bfr4\" (UniqueName: \"kubernetes.io/projected/bec0abd6-d181-493f-a285-932a17fac41d-kube-api-access-4bfr4\") pod \"nova-cell0-db-create-5n98m\" (UID: \"bec0abd6-d181-493f-a285-932a17fac41d\") " pod="openstack/nova-cell0-db-create-5n98m" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.374045 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vlhzw\" (UniqueName: \"kubernetes.io/projected/faa23366-9513-4a8e-af1f-1b6b7596a5ac-kube-api-access-vlhzw\") pod \"nova-api-782d-account-create-update-pglqr\" (UID: \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\") " pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.429188 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bfr4\" (UniqueName: \"kubernetes.io/projected/bec0abd6-d181-493f-a285-932a17fac41d-kube-api-access-4bfr4\") pod \"nova-cell0-db-create-5n98m\" (UID: \"bec0abd6-d181-493f-a285-932a17fac41d\") " pod="openstack/nova-cell0-db-create-5n98m" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.444109 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5n98m" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.476090 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlhzw\" (UniqueName: \"kubernetes.io/projected/faa23366-9513-4a8e-af1f-1b6b7596a5ac-kube-api-access-vlhzw\") pod \"nova-api-782d-account-create-update-pglqr\" (UID: \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\") " pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.476252 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa23366-9513-4a8e-af1f-1b6b7596a5ac-operator-scripts\") pod \"nova-api-782d-account-create-update-pglqr\" (UID: \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\") " pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.476313 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-operator-scripts\") pod 
\"nova-cell1-db-create-d28jb\" (UID: \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\") " pod="openstack/nova-cell1-db-create-d28jb" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.476357 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbbqh\" (UniqueName: \"kubernetes.io/projected/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-kube-api-access-rbbqh\") pod \"nova-cell1-db-create-d28jb\" (UID: \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\") " pod="openstack/nova-cell1-db-create-d28jb" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.477270 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa23366-9513-4a8e-af1f-1b6b7596a5ac-operator-scripts\") pod \"nova-api-782d-account-create-update-pglqr\" (UID: \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\") " pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.488022 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9002-account-create-update-mlmq6"] Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.489823 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9002-account-create-update-mlmq6" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.500568 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.511681 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9002-account-create-update-mlmq6"] Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.524071 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlhzw\" (UniqueName: \"kubernetes.io/projected/faa23366-9513-4a8e-af1f-1b6b7596a5ac-kube-api-access-vlhzw\") pod \"nova-api-782d-account-create-update-pglqr\" (UID: \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\") " pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.562100 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.581813 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25be920e-162b-4f60-851b-228167576b04-operator-scripts\") pod \"nova-cell0-9002-account-create-update-mlmq6\" (UID: \"25be920e-162b-4f60-851b-228167576b04\") " pod="openstack/nova-cell0-9002-account-create-update-mlmq6" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.582058 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjs9p\" (UniqueName: \"kubernetes.io/projected/25be920e-162b-4f60-851b-228167576b04-kube-api-access-gjs9p\") pod \"nova-cell0-9002-account-create-update-mlmq6\" (UID: \"25be920e-162b-4f60-851b-228167576b04\") " pod="openstack/nova-cell0-9002-account-create-update-mlmq6" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 
05:53:40.583072 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-operator-scripts\") pod \"nova-cell1-db-create-d28jb\" (UID: \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\") " pod="openstack/nova-cell1-db-create-d28jb" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.585403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbbqh\" (UniqueName: \"kubernetes.io/projected/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-kube-api-access-rbbqh\") pod \"nova-cell1-db-create-d28jb\" (UID: \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\") " pod="openstack/nova-cell1-db-create-d28jb" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.585244 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-operator-scripts\") pod \"nova-cell1-db-create-d28jb\" (UID: \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\") " pod="openstack/nova-cell1-db-create-d28jb" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.629944 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbbqh\" (UniqueName: \"kubernetes.io/projected/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-kube-api-access-rbbqh\") pod \"nova-cell1-db-create-d28jb\" (UID: \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\") " pod="openstack/nova-cell1-db-create-d28jb" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.673306 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-bcce-account-create-update-ftbtx"] Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.675349 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.684690 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.688818 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0a922f-47a8-482f-b2e5-b9fb6176c221-operator-scripts\") pod \"nova-cell1-bcce-account-create-update-ftbtx\" (UID: \"da0a922f-47a8-482f-b2e5-b9fb6176c221\") " pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.688906 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25be920e-162b-4f60-851b-228167576b04-operator-scripts\") pod \"nova-cell0-9002-account-create-update-mlmq6\" (UID: \"25be920e-162b-4f60-851b-228167576b04\") " pod="openstack/nova-cell0-9002-account-create-update-mlmq6" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.688948 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjs9p\" (UniqueName: \"kubernetes.io/projected/25be920e-162b-4f60-851b-228167576b04-kube-api-access-gjs9p\") pod \"nova-cell0-9002-account-create-update-mlmq6\" (UID: \"25be920e-162b-4f60-851b-228167576b04\") " pod="openstack/nova-cell0-9002-account-create-update-mlmq6" Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.689143 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zlc6\" (UniqueName: \"kubernetes.io/projected/da0a922f-47a8-482f-b2e5-b9fb6176c221-kube-api-access-4zlc6\") pod \"nova-cell1-bcce-account-create-update-ftbtx\" (UID: \"da0a922f-47a8-482f-b2e5-b9fb6176c221\") " pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" 
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.690042 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25be920e-162b-4f60-851b-228167576b04-operator-scripts\") pod \"nova-cell0-9002-account-create-update-mlmq6\" (UID: \"25be920e-162b-4f60-851b-228167576b04\") " pod="openstack/nova-cell0-9002-account-create-update-mlmq6"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.694218 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bcce-account-create-update-ftbtx"]
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.700658 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d28jb"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.721841 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjs9p\" (UniqueName: \"kubernetes.io/projected/25be920e-162b-4f60-851b-228167576b04-kube-api-access-gjs9p\") pod \"nova-cell0-9002-account-create-update-mlmq6\" (UID: \"25be920e-162b-4f60-851b-228167576b04\") " pod="openstack/nova-cell0-9002-account-create-update-mlmq6"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.793857 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zlc6\" (UniqueName: \"kubernetes.io/projected/da0a922f-47a8-482f-b2e5-b9fb6176c221-kube-api-access-4zlc6\") pod \"nova-cell1-bcce-account-create-update-ftbtx\" (UID: \"da0a922f-47a8-482f-b2e5-b9fb6176c221\") " pod="openstack/nova-cell1-bcce-account-create-update-ftbtx"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.794108 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0a922f-47a8-482f-b2e5-b9fb6176c221-operator-scripts\") pod \"nova-cell1-bcce-account-create-update-ftbtx\" (UID: \"da0a922f-47a8-482f-b2e5-b9fb6176c221\") " pod="openstack/nova-cell1-bcce-account-create-update-ftbtx"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.795332 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0a922f-47a8-482f-b2e5-b9fb6176c221-operator-scripts\") pod \"nova-cell1-bcce-account-create-update-ftbtx\" (UID: \"da0a922f-47a8-482f-b2e5-b9fb6176c221\") " pod="openstack/nova-cell1-bcce-account-create-update-ftbtx"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.813867 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zlc6\" (UniqueName: \"kubernetes.io/projected/da0a922f-47a8-482f-b2e5-b9fb6176c221-kube-api-access-4zlc6\") pod \"nova-cell1-bcce-account-create-update-ftbtx\" (UID: \"da0a922f-47a8-482f-b2e5-b9fb6176c221\") " pod="openstack/nova-cell1-bcce-account-create-update-ftbtx"
Mar 14 05:53:40 crc kubenswrapper[4713]: I0314 05:53:40.900007 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9002-account-create-update-mlmq6"
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.008020 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bcce-account-create-update-ftbtx"
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.332672 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6854d4949-xljzd"
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.419263 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c66c75585-lmbx8"]
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.419691 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c66c75585-lmbx8" podUID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerName="neutron-api" containerID="cri-o://c1adca9c77949e566b093cb78b119a75d1add017dcbc51a6d10bd6e0f0845a63" gracePeriod=30
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.420277 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c66c75585-lmbx8" podUID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerName="neutron-httpd" containerID="cri-o://23cd3874d58987b3193d2239efbf9fb030e08b5be4f155204c4c7fc9692edcbd" gracePeriod=30
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.479060 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75d59d8dc5-c57fx"
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.479145 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75d59d8dc5-c57fx"
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.848297 4713 generic.go:334] "Generic (PLEG): container finished" podID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerID="c9698ecce9eddc0bcd717f7b57cf34fdd3e3de66254ab15ddef07410622a4221" exitCode=0
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.848622 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ae5595c-d4de-4db7-b410-d149afd0f6a1","Type":"ContainerDied","Data":"c9698ecce9eddc0bcd717f7b57cf34fdd3e3de66254ab15ddef07410622a4221"}
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.855193 4713 generic.go:334] "Generic (PLEG): container finished" podID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerID="23cd3874d58987b3193d2239efbf9fb030e08b5be4f155204c4c7fc9692edcbd" exitCode=0
Mar 14 05:53:41 crc kubenswrapper[4713]: I0314 05:53:41.855254 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c66c75585-lmbx8" event={"ID":"00ca0cb1-0837-4538-ad90-a6425a10e037","Type":"ContainerDied","Data":"23cd3874d58987b3193d2239efbf9fb030e08b5be4f155204c4c7fc9692edcbd"}
Mar 14 05:53:42 crc kubenswrapper[4713]: I0314 05:53:42.281965 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="64bc4849-85d3-4043-bfae-18176e47b753" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.210:8776/healthcheck\": dial tcp 10.217.0.210:8776: connect: connection refused"
Mar 14 05:53:42 crc kubenswrapper[4713]: E0314 05:53:42.866153 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified"
Mar 14 05:53:42 crc kubenswrapper[4713]: E0314 05:53:42.866361 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n87hffh79h5c4h57bh685h75hd6hb7h8dh84h656h564h56bhbdhd9h67ch7ch6h595h4h57hd4h57fh597h74h654h58hbbh556hcbh9bq,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_CA_CERT,Value:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4v8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(869960ea-c2fe-4a61-8f70-2e7724af6426): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 14 05:53:42 crc kubenswrapper[4713]: E0314 05:53:42.871313 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="869960ea-c2fe-4a61-8f70-2e7724af6426"
Mar 14 05:53:44 crc kubenswrapper[4713]: E0314 05:53:44.009993 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="869960ea-c2fe-4a61-8f70-2e7724af6426"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.219688 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.336453 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-combined-ca-bundle\") pod \"64bc4849-85d3-4043-bfae-18176e47b753\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") "
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.336548 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data-custom\") pod \"64bc4849-85d3-4043-bfae-18176e47b753\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") "
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.336655 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-scripts\") pod \"64bc4849-85d3-4043-bfae-18176e47b753\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") "
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.336719 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knd82\" (UniqueName: \"kubernetes.io/projected/64bc4849-85d3-4043-bfae-18176e47b753-kube-api-access-knd82\") pod \"64bc4849-85d3-4043-bfae-18176e47b753\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") "
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.336777 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data\") pod \"64bc4849-85d3-4043-bfae-18176e47b753\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") "
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.336799 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64bc4849-85d3-4043-bfae-18176e47b753-logs\") pod \"64bc4849-85d3-4043-bfae-18176e47b753\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") "
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.336923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64bc4849-85d3-4043-bfae-18176e47b753-etc-machine-id\") pod \"64bc4849-85d3-4043-bfae-18176e47b753\" (UID: \"64bc4849-85d3-4043-bfae-18176e47b753\") "
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.337605 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64bc4849-85d3-4043-bfae-18176e47b753-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "64bc4849-85d3-4043-bfae-18176e47b753" (UID: "64bc4849-85d3-4043-bfae-18176e47b753"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.367344 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64bc4849-85d3-4043-bfae-18176e47b753-logs" (OuterVolumeSpecName: "logs") pod "64bc4849-85d3-4043-bfae-18176e47b753" (UID: "64bc4849-85d3-4043-bfae-18176e47b753"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.426533 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64bc4849-85d3-4043-bfae-18176e47b753-kube-api-access-knd82" (OuterVolumeSpecName: "kube-api-access-knd82") pod "64bc4849-85d3-4043-bfae-18176e47b753" (UID: "64bc4849-85d3-4043-bfae-18176e47b753"). InnerVolumeSpecName "kube-api-access-knd82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.439795 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knd82\" (UniqueName: \"kubernetes.io/projected/64bc4849-85d3-4043-bfae-18176e47b753-kube-api-access-knd82\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.439825 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64bc4849-85d3-4043-bfae-18176e47b753-logs\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.439838 4713 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64bc4849-85d3-4043-bfae-18176e47b753-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.446985 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-scripts" (OuterVolumeSpecName: "scripts") pod "64bc4849-85d3-4043-bfae-18176e47b753" (UID: "64bc4849-85d3-4043-bfae-18176e47b753"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.447004 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "64bc4849-85d3-4043-bfae-18176e47b753" (UID: "64bc4849-85d3-4043-bfae-18176e47b753"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.465380 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6979cc54d6-q8q5n"]
Mar 14 05:53:44 crc kubenswrapper[4713]: E0314 05:53:44.465930 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64bc4849-85d3-4043-bfae-18176e47b753" containerName="cinder-api-log"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.465943 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="64bc4849-85d3-4043-bfae-18176e47b753" containerName="cinder-api-log"
Mar 14 05:53:44 crc kubenswrapper[4713]: E0314 05:53:44.465998 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64bc4849-85d3-4043-bfae-18176e47b753" containerName="cinder-api"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.466006 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="64bc4849-85d3-4043-bfae-18176e47b753" containerName="cinder-api"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.466225 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="64bc4849-85d3-4043-bfae-18176e47b753" containerName="cinder-api-log"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.466255 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="64bc4849-85d3-4043-bfae-18176e47b753" containerName="cinder-api"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.467129 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6979cc54d6-q8q5n"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.472836 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64bc4849-85d3-4043-bfae-18176e47b753" (UID: "64bc4849-85d3-4043-bfae-18176e47b753"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.493406 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6979cc54d6-q8q5n"]
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.541417 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-b896f6bb4-lbjfr"]
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.543047 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-b896f6bb4-lbjfr"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.545450 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.545509 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-combined-ca-bundle\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.545596 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data-custom\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n"
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.545620 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q4bk\" (UniqueName:
\"kubernetes.io/projected/0f319ea1-f399-41ba-81cd-edccb9905c98-kube-api-access-5q4bk\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.545739 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.545751 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.545760 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.557723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data" (OuterVolumeSpecName: "config-data") pod "64bc4849-85d3-4043-bfae-18176e47b753" (UID: "64bc4849-85d3-4043-bfae-18176e47b753"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.612286 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5955cd59bf-ttt8j"] Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.614191 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647496 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-combined-ca-bundle\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647603 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lwtt\" (UniqueName: \"kubernetes.io/projected/473ca677-e1ad-431f-9f9e-c8260e43cda2-kube-api-access-2lwtt\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647680 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647710 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-combined-ca-bundle\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647737 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: 
\"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647767 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data-custom\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647810 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647852 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dz7t\" (UniqueName: \"kubernetes.io/projected/af1eb3bb-1126-4289-9156-a804d676272f-kube-api-access-2dz7t\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647913 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data-custom\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647937 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data-custom\") pod 
\"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.647961 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q4bk\" (UniqueName: \"kubernetes.io/projected/0f319ea1-f399-41ba-81cd-edccb9905c98-kube-api-access-5q4bk\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.648021 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-combined-ca-bundle\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.648095 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64bc4849-85d3-4043-bfae-18176e47b753-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.652509 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.655306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data-custom\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc 
kubenswrapper[4713]: I0314 05:53:44.665004 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-combined-ca-bundle\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.678804 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-b896f6bb4-lbjfr"] Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.684561 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q4bk\" (UniqueName: \"kubernetes.io/projected/0f319ea1-f399-41ba-81cd-edccb9905c98-kube-api-access-5q4bk\") pod \"heat-engine-6979cc54d6-q8q5n\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.694420 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5955cd59bf-ttt8j"] Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.739428 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.747618 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.750124 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-sg-core-conf-yaml\") pod \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.750437 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-config-data\") pod \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.750532 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-combined-ca-bundle\") pod \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.750799 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx78h\" (UniqueName: \"kubernetes.io/projected/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-kube-api-access-rx78h\") pod \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.750843 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-scripts\") pod \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.750862 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-run-httpd\") pod \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.750972 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-log-httpd\") pod \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\" (UID: \"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.752433 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-combined-ca-bundle\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.752515 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-combined-ca-bundle\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.752818 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lwtt\" (UniqueName: \"kubernetes.io/projected/473ca677-e1ad-431f-9f9e-c8260e43cda2-kube-api-access-2lwtt\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.752955 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" 
(UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.753002 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data-custom\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.753065 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.753122 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dz7t\" (UniqueName: \"kubernetes.io/projected/af1eb3bb-1126-4289-9156-a804d676272f-kube-api-access-2dz7t\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.753172 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data-custom\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.758319 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" (UID: "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.759447 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" (UID: "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.763890 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data-custom\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.765881 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data-custom\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.767998 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-combined-ca-bundle\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.769829 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-kube-api-access-rx78h" (OuterVolumeSpecName: "kube-api-access-rx78h") pod "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" (UID: 
"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b"). InnerVolumeSpecName "kube-api-access-rx78h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.772716 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.780054 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-scripts" (OuterVolumeSpecName: "scripts") pod "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" (UID: "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.782932 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-combined-ca-bundle\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.783269 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dz7t\" (UniqueName: \"kubernetes.io/projected/af1eb3bb-1126-4289-9156-a804d676272f-kube-api-access-2dz7t\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.784560 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data\") pod \"heat-api-b896f6bb4-lbjfr\" (UID: 
\"af1eb3bb-1126-4289-9156-a804d676272f\") " pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.793137 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bcce-account-create-update-ftbtx"] Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.799374 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lwtt\" (UniqueName: \"kubernetes.io/projected/473ca677-e1ad-431f-9f9e-c8260e43cda2-kube-api-access-2lwtt\") pod \"heat-cfnapi-5955cd59bf-ttt8j\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.811842 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.818916 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.857497 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-httpd-run\") pod \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.857557 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-internal-tls-certs\") pod \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.857595 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-logs\") pod 
\"2ae5595c-d4de-4db7-b410-d149afd0f6a1\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.857650 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-scripts\") pod \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.857849 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.857878 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-httpd-run\") pod \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.857899 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npwkk\" (UniqueName: \"kubernetes.io/projected/b733ba7d-6fd3-430d-83ee-3d9f32bad251-kube-api-access-npwkk\") pod \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.857915 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-combined-ca-bundle\") pod \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.857963 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-config-data\") pod \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858004 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-public-tls-certs\") pod \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858073 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858104 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-logs\") pod \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858130 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-combined-ca-bundle\") pod \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858177 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-scripts\") pod \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") " Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 
05:53:44.858203 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7d6q\" (UniqueName: \"kubernetes.io/projected/2ae5595c-d4de-4db7-b410-d149afd0f6a1-kube-api-access-n7d6q\") pod \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\" (UID: \"2ae5595c-d4de-4db7-b410-d149afd0f6a1\") "
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858266 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-config-data\") pod \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\" (UID: \"b733ba7d-6fd3-430d-83ee-3d9f32bad251\") "
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858959 4713 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858972 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx78h\" (UniqueName: \"kubernetes.io/projected/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-kube-api-access-rx78h\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858982 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.858991 4713 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.862252 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-logs" (OuterVolumeSpecName: "logs") pod "b733ba7d-6fd3-430d-83ee-3d9f32bad251" (UID: "b733ba7d-6fd3-430d-83ee-3d9f32bad251"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.862695 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2ae5595c-d4de-4db7-b410-d149afd0f6a1" (UID: "2ae5595c-d4de-4db7-b410-d149afd0f6a1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.862784 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b733ba7d-6fd3-430d-83ee-3d9f32bad251" (UID: "b733ba7d-6fd3-430d-83ee-3d9f32bad251"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.870536 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-logs" (OuterVolumeSpecName: "logs") pod "2ae5595c-d4de-4db7-b410-d149afd0f6a1" (UID: "2ae5595c-d4de-4db7-b410-d149afd0f6a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.891859 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae5595c-d4de-4db7-b410-d149afd0f6a1-kube-api-access-n7d6q" (OuterVolumeSpecName: "kube-api-access-n7d6q") pod "2ae5595c-d4de-4db7-b410-d149afd0f6a1" (UID: "2ae5595c-d4de-4db7-b410-d149afd0f6a1"). InnerVolumeSpecName "kube-api-access-n7d6q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.896724 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b733ba7d-6fd3-430d-83ee-3d9f32bad251-kube-api-access-npwkk" (OuterVolumeSpecName: "kube-api-access-npwkk") pod "b733ba7d-6fd3-430d-83ee-3d9f32bad251" (UID: "b733ba7d-6fd3-430d-83ee-3d9f32bad251"). InnerVolumeSpecName "kube-api-access-npwkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.899325 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-scripts" (OuterVolumeSpecName: "scripts") pod "2ae5595c-d4de-4db7-b410-d149afd0f6a1" (UID: "2ae5595c-d4de-4db7-b410-d149afd0f6a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.908420 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-scripts" (OuterVolumeSpecName: "scripts") pod "b733ba7d-6fd3-430d-83ee-3d9f32bad251" (UID: "b733ba7d-6fd3-430d-83ee-3d9f32bad251"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.961544 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.961570 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npwkk\" (UniqueName: \"kubernetes.io/projected/b733ba7d-6fd3-430d-83ee-3d9f32bad251-kube-api-access-npwkk\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.961581 4713 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.961589 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b733ba7d-6fd3-430d-83ee-3d9f32bad251-logs\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.961599 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.961606 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7d6q\" (UniqueName: \"kubernetes.io/projected/2ae5595c-d4de-4db7-b410-d149afd0f6a1-kube-api-access-n7d6q\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.961614 4713 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.961622 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ae5595c-d4de-4db7-b410-d149afd0f6a1-logs\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.987867 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034" (OuterVolumeSpecName: "glance") pod "2ae5595c-d4de-4db7-b410-d149afd0f6a1" (UID: "2ae5595c-d4de-4db7-b410-d149afd0f6a1"). InnerVolumeSpecName "pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.991788 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" (UID: "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:44 crc kubenswrapper[4713]: I0314 05:53:44.994340 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ae5595c-d4de-4db7-b410-d149afd0f6a1" (UID: "2ae5595c-d4de-4db7-b410-d149afd0f6a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.002129 4713 generic.go:334] "Generic (PLEG): container finished" podID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerID="c1adca9c77949e566b093cb78b119a75d1add017dcbc51a6d10bd6e0f0845a63" exitCode=0
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.002221 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c66c75585-lmbx8" event={"ID":"00ca0cb1-0837-4538-ad90-a6425a10e037","Type":"ContainerDied","Data":"c1adca9c77949e566b093cb78b119a75d1add017dcbc51a6d10bd6e0f0845a63"}
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.010086 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b","Type":"ContainerDied","Data":"66a805e04b9b24f6b6f7905ca2ea6d10bcdf340b1090eed2e0c5e0592b998009"}
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.010138 4713 scope.go:117] "RemoveContainer" containerID="d7e456d420f4d54c21e675786f8716139e7ac9049bd419aa4530ed8894df66e6"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.010301 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.034650 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ae5595c-d4de-4db7-b410-d149afd0f6a1","Type":"ContainerDied","Data":"687d638a944b8d5b372d1278252d0086c05e63ed0da68091ce9e6d4cf45a3c99"}
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.034781 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.040531 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" event={"ID":"da0a922f-47a8-482f-b2e5-b9fb6176c221","Type":"ContainerStarted","Data":"303d60c524ab6d91e414751fba0c977cf186f9d3b5c9a4d7074f653b70d68f58"}
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.046762 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64bc4849-85d3-4043-bfae-18176e47b753","Type":"ContainerDied","Data":"91aa6d2909fec99c84c4297c6c8fa7e572ae38bb71b0bb7d6e0ee98683c6de63"}
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.046879 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.052422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b733ba7d-6fd3-430d-83ee-3d9f32bad251","Type":"ContainerDied","Data":"763c34625f57e59928cadd50fc88f976b057c9e843dd776c45f095e3d171f2f3"}
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.052522 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.057112 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810" (OuterVolumeSpecName: "glance") pod "b733ba7d-6fd3-430d-83ee-3d9f32bad251" (UID: "b733ba7d-6fd3-430d-83ee-3d9f32bad251"). InnerVolumeSpecName "pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.063798 4713 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.063844 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") on node \"crc\" "
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.063857 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.063872 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") on node \"crc\" "
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.068429 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-b896f6bb4-lbjfr"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.113330 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.154732 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b733ba7d-6fd3-430d-83ee-3d9f32bad251" (UID: "b733ba7d-6fd3-430d-83ee-3d9f32bad251"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.155440 4713 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.155566 4713 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810") on node "crc"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.182177 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b733ba7d-6fd3-430d-83ee-3d9f32bad251" (UID: "b733ba7d-6fd3-430d-83ee-3d9f32bad251"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.191037 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" (UID: "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.191675 4713 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.191707 4713 reconciler_common.go:293] "Volume detached for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.191732 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.191748 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.192352 4713 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.192485 4713 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034") on node "crc"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.197813 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-config-data" (OuterVolumeSpecName: "config-data") pod "2ae5595c-d4de-4db7-b410-d149afd0f6a1" (UID: "2ae5595c-d4de-4db7-b410-d149afd0f6a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.232487 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-config-data" (OuterVolumeSpecName: "config-data") pod "b733ba7d-6fd3-430d-83ee-3d9f32bad251" (UID: "b733ba7d-6fd3-430d-83ee-3d9f32bad251"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.337113 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b733ba7d-6fd3-430d-83ee-3d9f32bad251-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.339321 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-config-data" (OuterVolumeSpecName: "config-data") pod "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" (UID: "3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.345757 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2ae5595c-d4de-4db7-b410-d149afd0f6a1" (UID: "2ae5595c-d4de-4db7-b410-d149afd0f6a1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.359731 4713 reconciler_common.go:293] "Volume detached for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.360395 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.432603 4713 scope.go:117] "RemoveContainer" containerID="44da8a65554c739ca9899effe32764b7ae4be9a685379423a55bb6937395ec4b"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.447511 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.463475 4713 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae5595c-d4de-4db7-b410-d149afd0f6a1-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.463504 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.469400 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.535135 4713 scope.go:117] "RemoveContainer" containerID="3e575b54cb1b5df96bdf0ab7ee4be403c8cb34f91c750616e5a05e623cb6f72f"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.560969 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 14 05:53:45 crc kubenswrapper[4713]: E0314 05:53:45.561882 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="proxy-httpd"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.561981 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="proxy-httpd"
Mar 14 05:53:45 crc kubenswrapper[4713]: E0314 05:53:45.562050 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="ceilometer-notification-agent"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.562147 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="ceilometer-notification-agent"
Mar 14 05:53:45 crc kubenswrapper[4713]: E0314 05:53:45.562245 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerName="glance-log"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.562309 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerName="glance-log"
Mar 14 05:53:45 crc kubenswrapper[4713]: E0314 05:53:45.562376 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerName="glance-httpd"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.562436 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerName="glance-httpd"
Mar 14 05:53:45 crc kubenswrapper[4713]: E0314 05:53:45.562517 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="sg-core"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.562580 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="sg-core"
Mar 14 05:53:45 crc kubenswrapper[4713]: E0314 05:53:45.562680 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="ceilometer-central-agent"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.562772 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="ceilometer-central-agent"
Mar 14 05:53:45 crc kubenswrapper[4713]: E0314 05:53:45.562843 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-log"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.562905 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-log"
Mar 14 05:53:45 crc kubenswrapper[4713]: E0314 05:53:45.562976 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-httpd"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.563037 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-httpd"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.563560 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-log"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.563655 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="sg-core"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.563731 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="ceilometer-notification-agent"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.563806 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" containerName="glance-httpd"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.563871 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerName="glance-log"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.563942 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" containerName="glance-httpd"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.564023 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="ceilometer-central-agent"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.564099 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" containerName="proxy-httpd"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.566047 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.586083 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.586415 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.587052 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.644066 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64bc4849-85d3-4043-bfae-18176e47b753" path="/var/lib/kubelet/pods/64bc4849-85d3-4043-bfae-18176e47b753/volumes"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.645393 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.645425 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.645438 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.656100 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.659251 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.662593 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.675119 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1b88956-89b2-49f5-881a-f757d005ee2a-logs\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.675189 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.675303 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-config-data-custom\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.675325 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.675412 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjxvv\" (UniqueName: \"kubernetes.io/projected/d1b88956-89b2-49f5-881a-f757d005ee2a-kube-api-access-mjxvv\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.675469 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.675490 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-scripts\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.675587 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1b88956-89b2-49f5-881a-f757d005ee2a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.675620 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-config-data\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.685440 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kknmp"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.686000 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.686116 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.690290 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.699187 4713 scope.go:117] "RemoveContainer" containerID="c75a8381a926238c7b1e25e6c5959462a6e3ad9dc577286dd72727e414dae81d"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.722753 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.777879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-config-data\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.778914 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5171aada-64eb-4788-8446-346549791051-logs\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.778964 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779015 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1b88956-89b2-49f5-881a-f757d005ee2a-logs\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779059 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779126 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-config-data-custom\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779151 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779181 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-scripts\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779237 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7brc6\" (UniqueName: \"kubernetes.io/projected/5171aada-64eb-4788-8446-346549791051-kube-api-access-7brc6\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779283 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779322 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjxvv\" (UniqueName: \"kubernetes.io/projected/d1b88956-89b2-49f5-881a-f757d005ee2a-kube-api-access-mjxvv\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779393 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-scripts\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779421 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779534 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779563 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-config-data\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779603 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1b88956-89b2-49f5-881a-f757d005ee2a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.779626 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5171aada-64eb-4788-8446-346549791051-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.790622 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1b88956-89b2-49f5-881a-f757d005ee2a-logs\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.790713 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1b88956-89b2-49f5-881a-f757d005ee2a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.793688 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-config-data\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.801567 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.809523 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-scripts\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.816746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.816793 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-config-data-custom\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.817139 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b88956-89b2-49f5-881a-f757d005ee2a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0"
Mar 14 05:53:45 crc
kubenswrapper[4713]: I0314 05:53:45.818236 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.830915 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjxvv\" (UniqueName: \"kubernetes.io/projected/d1b88956-89b2-49f5-881a-f757d005ee2a-kube-api-access-mjxvv\") pod \"cinder-api-0\" (UID: \"d1b88956-89b2-49f5-881a-f757d005ee2a\") " pod="openstack/cinder-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.866519 4713 scope.go:117] "RemoveContainer" containerID="c9698ecce9eddc0bcd717f7b57cf34fdd3e3de66254ab15ddef07410622a4221" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.885642 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.885761 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-scripts\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.885809 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7brc6\" (UniqueName: \"kubernetes.io/projected/5171aada-64eb-4788-8446-346549791051-kube-api-access-7brc6\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.885842 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.886010 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.886040 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-config-data\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.886080 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5171aada-64eb-4788-8446-346549791051-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.886160 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5171aada-64eb-4788-8446-346549791051-logs\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.886908 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5171aada-64eb-4788-8446-346549791051-logs\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.890403 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.892650 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5171aada-64eb-4788-8446-346549791051-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.895613 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-config-data\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.902181 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.936527 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.949229 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.953091 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5171aada-64eb-4788-8446-346549791051-scripts\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.953117 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7brc6\" (UniqueName: \"kubernetes.io/projected/5171aada-64eb-4788-8446-346549791051-kube-api-access-7brc6\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.953253 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.961761 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.961993 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.992249 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:53:45 crc kubenswrapper[4713]: I0314 05:53:45.992296 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/364e1a6e25afa18dd00146c1f7173dd96ffacfd7742d763c37150bde38ea6657/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.092495 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.095508 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.096116 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-config-data\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.096327 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-scripts\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.096413 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.096494 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-log-httpd\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.096584 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf4z2\" (UniqueName: \"kubernetes.io/projected/17433094-f4d7-4059-a74a-80ee3dff9981-kube-api-access-sf4z2\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.096840 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-run-httpd\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.121512 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4183adc1-bf8b-4455-87d0-d1fb9a824810\") pod \"glance-default-external-api-0\" (UID: \"5171aada-64eb-4788-8446-346549791051\") " pod="openstack/glance-default-external-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.139457 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] 
Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.148521 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" event={"ID":"da0a922f-47a8-482f-b2e5-b9fb6176c221","Type":"ContainerStarted","Data":"de8508c0f617dfad9f581e264b3e013a501c8184dfcce239f4c95713cab8d3cc"} Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.157196 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.162330 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.163052 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c66c75585-lmbx8" event={"ID":"00ca0cb1-0837-4538-ad90-a6425a10e037","Type":"ContainerDied","Data":"b848052252f303b975bacdff3e566608159b40824a0864575145493205b03151"} Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.173481 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:53:46 crc kubenswrapper[4713]: E0314 05:53:46.174153 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerName="neutron-api" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.174174 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerName="neutron-api" Mar 14 05:53:46 crc kubenswrapper[4713]: E0314 05:53:46.174220 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerName="neutron-httpd" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.174229 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerName="neutron-httpd" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.174522 
4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerName="neutron-api" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.174549 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ca0cb1-0837-4538-ad90-a6425a10e037" containerName="neutron-httpd" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.176347 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.182726 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.182930 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.191484 4713 scope.go:117] "RemoveContainer" containerID="9d1a343a53ec0ab52b00862fd9c26a72290f2ed8da4865f4e2b714d30b83a19e" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.196253 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.200756 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-scripts\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.200812 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.200834 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf4z2\" (UniqueName: \"kubernetes.io/projected/17433094-f4d7-4059-a74a-80ee3dff9981-kube-api-access-sf4z2\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.200851 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-log-httpd\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.200921 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-run-httpd\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.200955 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.200997 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-config-data\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.202543 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-run-httpd\") pod \"ceilometer-0\" (UID: 
\"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.203759 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-log-httpd\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.215168 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.217112 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-config-data\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.218197 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" podStartSLOduration=6.218175388 podStartE2EDuration="6.218175388s" podCreationTimestamp="2026-03-14 05:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:46.182198961 +0000 UTC m=+1609.270108261" watchObservedRunningTime="2026-03-14 05:53:46.218175388 +0000 UTC m=+1609.306084688" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.232400 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf4z2\" (UniqueName: \"kubernetes.io/projected/17433094-f4d7-4059-a74a-80ee3dff9981-kube-api-access-sf4z2\") pod \"ceilometer-0\" (UID: 
\"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.232857 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-scripts\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.234466 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.302708 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-combined-ca-bundle\") pod \"00ca0cb1-0837-4538-ad90-a6425a10e037\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.302764 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhxtn\" (UniqueName: \"kubernetes.io/projected/00ca0cb1-0837-4538-ad90-a6425a10e037-kube-api-access-mhxtn\") pod \"00ca0cb1-0837-4538-ad90-a6425a10e037\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.302836 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-config\") pod \"00ca0cb1-0837-4538-ad90-a6425a10e037\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.302903 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-public-tls-certs\") pod \"00ca0cb1-0837-4538-ad90-a6425a10e037\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.303049 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-internal-tls-certs\") pod \"00ca0cb1-0837-4538-ad90-a6425a10e037\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.303137 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-httpd-config\") pod \"00ca0cb1-0837-4538-ad90-a6425a10e037\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.303169 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-ovndb-tls-certs\") pod \"00ca0cb1-0837-4538-ad90-a6425a10e037\" (UID: \"00ca0cb1-0837-4538-ad90-a6425a10e037\") " Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.307493 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n89j\" (UniqueName: \"kubernetes.io/projected/b6b4ad11-424d-4394-b809-9fb4e559e255-kube-api-access-7n89j\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.339768 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.339836 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.339959 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6b4ad11-424d-4394-b809-9fb4e559e255-logs\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.340278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.340333 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.340406 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b6b4ad11-424d-4394-b809-9fb4e559e255-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.340478 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.342239 4713 scope.go:117] "RemoveContainer" containerID="994ee36d06ca632ff30863cbd9a2aa7b624d5f4c65a4267928cdab75b1474f56" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.377570 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ca0cb1-0837-4538-ad90-a6425a10e037-kube-api-access-mhxtn" (OuterVolumeSpecName: "kube-api-access-mhxtn") pod "00ca0cb1-0837-4538-ad90-a6425a10e037" (UID: "00ca0cb1-0837-4538-ad90-a6425a10e037"). InnerVolumeSpecName "kube-api-access-mhxtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.381481 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vgcgm"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.390198 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.428497 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "00ca0cb1-0837-4538-ad90-a6425a10e037" (UID: "00ca0cb1-0837-4538-ad90-a6425a10e037"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.444992 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.445045 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.445120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6b4ad11-424d-4394-b809-9fb4e559e255-logs\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.446807 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.446871 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.446931 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6b4ad11-424d-4394-b809-9fb4e559e255-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.447001 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.447108 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n89j\" (UniqueName: \"kubernetes.io/projected/b6b4ad11-424d-4394-b809-9fb4e559e255-kube-api-access-7n89j\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.447373 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhxtn\" (UniqueName: \"kubernetes.io/projected/00ca0cb1-0837-4538-ad90-a6425a10e037-kube-api-access-mhxtn\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.447389 4713 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.448076 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6b4ad11-424d-4394-b809-9fb4e559e255-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.448430 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6b4ad11-424d-4394-b809-9fb4e559e255-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.450919 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-5cxjw"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.453084 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.453145 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/09d437df07f0cc980684bdc1a6436f63ebf1e68a215d57555012df4017c88ddd/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.458018 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.484674 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.487790 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.488466 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.490532 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-599cd54c4b-t7gdc"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.496310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b4ad11-424d-4394-b809-9fb4e559e255-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.499907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n89j\" (UniqueName: \"kubernetes.io/projected/b6b4ad11-424d-4394-b809-9fb4e559e255-kube-api-access-7n89j\") 
pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.563435 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-684cc7695b-tnj9p"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.587672 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00ca0cb1-0837-4538-ad90-a6425a10e037" (UID: "00ca0cb1-0837-4538-ad90-a6425a10e037"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.640677 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6cffcd59cb-mklc8"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.653319 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.653501 4713 scope.go:117] "RemoveContainer" containerID="b5f8077200584a3eab19484274276e4e9756923899697e691e5672f787622409" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.658046 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "00ca0cb1-0837-4538-ad90-a6425a10e037" (UID: "00ca0cb1-0837-4538-ad90-a6425a10e037"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.666093 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "00ca0cb1-0837-4538-ad90-a6425a10e037" (UID: "00ca0cb1-0837-4538-ad90-a6425a10e037"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.669770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f3b4e7b-64d5-449b-b66e-9a5aa6dad034\") pod \"glance-default-internal-api-0\" (UID: \"b6b4ad11-424d-4394-b809-9fb4e559e255\") " pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.671815 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9002-account-create-update-mlmq6"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.688804 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-config" (OuterVolumeSpecName: "config") pod "00ca0cb1-0837-4538-ad90-a6425a10e037" (UID: "00ca0cb1-0837-4538-ad90-a6425a10e037"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.694726 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.699674 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5n98m"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.714540 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-782d-account-create-update-pglqr"] Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.715784 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "00ca0cb1-0837-4538-ad90-a6425a10e037" (UID: "00ca0cb1-0837-4538-ad90-a6425a10e037"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.725262 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d28jb"] Mar 14 05:53:46 crc kubenswrapper[4713]: W0314 05:53:46.754856 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaa23366_9513_4a8e_af1f_1b6b7596a5ac.slice/crio-76cf1e920fa7496ad506eda8a7c0fe65795a9d3e47ad1d497fffe14fab32ac31 WatchSource:0}: Error finding container 76cf1e920fa7496ad506eda8a7c0fe65795a9d3e47ad1d497fffe14fab32ac31: Status 404 returned error can't find the container with id 76cf1e920fa7496ad506eda8a7c0fe65795a9d3e47ad1d497fffe14fab32ac31 Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.756422 4713 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.756449 4713 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.756461 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.756472 4713 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00ca0cb1-0837-4538-ad90-a6425a10e037-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:46 crc kubenswrapper[4713]: W0314 05:53:46.804979 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3b63ac3_87f8_43e2_b546_0cd9d6025e7e.slice/crio-b5eb5625be77daa7359ac853aedbf4e043ce126b286a016bdc107cc2fb91c554 WatchSource:0}: Error finding container b5eb5625be77daa7359ac853aedbf4e043ce126b286a016bdc107cc2fb91c554: Status 404 returned error can't find the container with id b5eb5625be77daa7359ac853aedbf4e043ce126b286a016bdc107cc2fb91c554 Mar 14 05:53:46 crc kubenswrapper[4713]: I0314 05:53:46.805709 4713 scope.go:117] "RemoveContainer" containerID="fe9f1c4270f728c656634153b98deef2c3986d1086d3ed248f07da3063174eb1" Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.023314 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5955cd59bf-ttt8j"] Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.066686 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6979cc54d6-q8q5n"] Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.086673 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-b896f6bb4-lbjfr"] Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.097853 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 
05:53:47 crc kubenswrapper[4713]: W0314 05:53:47.122861 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod473ca677_e1ad_431f_9f9e_c8260e43cda2.slice/crio-f2cf62a638222b53d1705e3cd94f3adf3ac38dd5262fad49a3edda7405c22f30 WatchSource:0}: Error finding container f2cf62a638222b53d1705e3cd94f3adf3ac38dd5262fad49a3edda7405c22f30: Status 404 returned error can't find the container with id f2cf62a638222b53d1705e3cd94f3adf3ac38dd5262fad49a3edda7405c22f30 Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.220159 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b896f6bb4-lbjfr" event={"ID":"af1eb3bb-1126-4289-9156-a804d676272f","Type":"ContainerStarted","Data":"ba1a97434cb28f52cbba9ead8ac9eb3db263bbd189769e32f0210987e5c0748d"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.249663 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cffcd59cb-mklc8" event={"ID":"d0f3b8e7-bac4-47da-8b3f-c18d437789e4","Type":"ContainerStarted","Data":"6af596e50714b2143f24e0c81a3b7cb8c1e1ca0c2dafcda983590ffb9ae3b0e6"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.260348 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9002-account-create-update-mlmq6" event={"ID":"25be920e-162b-4f60-851b-228167576b04","Type":"ContainerStarted","Data":"947da072b050f303f0b2f319ea15237e8b31f580b2d4e40739eb98496f7eaa32"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.263885 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" event={"ID":"473ca677-e1ad-431f-9f9e-c8260e43cda2","Type":"ContainerStarted","Data":"f2cf62a638222b53d1705e3cd94f3adf3ac38dd5262fad49a3edda7405c22f30"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.291605 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-599cd54c4b-t7gdc" 
event={"ID":"f1d1368b-65d8-43fa-8025-0c41a7d0dd14","Type":"ContainerStarted","Data":"94cd47374e3af312227561edd10dda6fb06d67ac93c8e4acea0005231c3e470b"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.313724 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vgcgm" event={"ID":"bd9da5a8-6d6b-404f-9cb7-3030364e35e2","Type":"ContainerStarted","Data":"3eede5e670de2d5a3dd4e1dcf1db98befac79c13e1dd6156de9f63432c9f9176"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.313768 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vgcgm" event={"ID":"bd9da5a8-6d6b-404f-9cb7-3030364e35e2","Type":"ContainerStarted","Data":"c40c993be315d7a04b85b9057dd079afc42339f81f3ceb116606b51c83a95b16"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.355024 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6979cc54d6-q8q5n" event={"ID":"0f319ea1-f399-41ba-81cd-edccb9905c98","Type":"ContainerStarted","Data":"841926dee436c6b507d7b1eab5d1c68eef782517c0305ef4874e7740c6b57873"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.367999 4713 scope.go:117] "RemoveContainer" containerID="57cee5ab4a51db1af616270bd1f42f4369a82ee3d254e06ea9e8c414b14e8509" Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.391738 4713 generic.go:334] "Generic (PLEG): container finished" podID="da0a922f-47a8-482f-b2e5-b9fb6176c221" containerID="de8508c0f617dfad9f581e264b3e013a501c8184dfcce239f4c95713cab8d3cc" exitCode=0 Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.392228 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" event={"ID":"da0a922f-47a8-482f-b2e5-b9fb6176c221","Type":"ContainerDied","Data":"de8508c0f617dfad9f581e264b3e013a501c8184dfcce239f4c95713cab8d3cc"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.401378 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-cfnapi-684cc7695b-tnj9p" event={"ID":"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c","Type":"ContainerStarted","Data":"1b46036c4f335b3c4ea4e02cf16db9e96243c7f83e8edd2b5703a07629506cd5"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.428765 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d1b88956-89b2-49f5-881a-f757d005ee2a","Type":"ContainerStarted","Data":"f3ad9a19e75c3382f22be10710ab10053475bf4f1f305ef0ed8de6ee02c98c63"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.430250 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d28jb" event={"ID":"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e","Type":"ContainerStarted","Data":"b5eb5625be77daa7359ac853aedbf4e043ce126b286a016bdc107cc2fb91c554"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.445649 4713 scope.go:117] "RemoveContainer" containerID="23cd3874d58987b3193d2239efbf9fb030e08b5be4f155204c4c7fc9692edcbd" Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.449490 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5n98m" event={"ID":"bec0abd6-d181-493f-a285-932a17fac41d","Type":"ContainerStarted","Data":"36c17a373ae7f4dd246052dca70cbe35f38c776c6b33bc7558e63f476526db34"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.464482 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-782d-account-create-update-pglqr" event={"ID":"faa23366-9513-4a8e-af1f-1b6b7596a5ac","Type":"ContainerStarted","Data":"76cf1e920fa7496ad506eda8a7c0fe65795a9d3e47ad1d497fffe14fab32ac31"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.468808 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c66c75585-lmbx8" Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.481882 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" event={"ID":"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3","Type":"ContainerStarted","Data":"ef0833e1e4a75cc7baf3bc0300fd64574cc078f31a1beae868833c6bf62349b5"} Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.499491 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.529506 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c66c75585-lmbx8"] Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.543446 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c66c75585-lmbx8"] Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.554092 4713 scope.go:117] "RemoveContainer" containerID="c1adca9c77949e566b093cb78b119a75d1add017dcbc51a6d10bd6e0f0845a63" Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.603565 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ca0cb1-0837-4538-ad90-a6425a10e037" path="/var/lib/kubelet/pods/00ca0cb1-0837-4538-ad90-a6425a10e037/volumes" Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.606471 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae5595c-d4de-4db7-b410-d149afd0f6a1" path="/var/lib/kubelet/pods/2ae5595c-d4de-4db7-b410-d149afd0f6a1/volumes" Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.607311 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b" path="/var/lib/kubelet/pods/3d19e2d1-7c0e-4199-ac1d-c1cb9fc56d3b/volumes" Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.609323 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b733ba7d-6fd3-430d-83ee-3d9f32bad251" 
path="/var/lib/kubelet/pods/b733ba7d-6fd3-430d-83ee-3d9f32bad251/volumes" Mar 14 05:53:47 crc kubenswrapper[4713]: I0314 05:53:47.815387 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:53:47 crc kubenswrapper[4713]: W0314 05:53:47.896526 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17433094_f4d7_4059_a74a_80ee3dff9981.slice/crio-7f674ed356ffe7f3da5f77e31d004959e7db65473f0ec1126f53966463fa7f2b WatchSource:0}: Error finding container 7f674ed356ffe7f3da5f77e31d004959e7db65473f0ec1126f53966463fa7f2b: Status 404 returned error can't find the container with id 7f674ed356ffe7f3da5f77e31d004959e7db65473f0ec1126f53966463fa7f2b Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.103164 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 05:53:48 crc kubenswrapper[4713]: E0314 05:53:48.369515 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbec0abd6_d181_493f_a285_932a17fac41d.slice/crio-conmon-317be9ba6837e17cd3e2c8e35046a0dc9dd0dd7e55036a9016f3e2b8d607e836.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.503562 4713 generic.go:334] "Generic (PLEG): container finished" podID="25be920e-162b-4f60-851b-228167576b04" containerID="a3d2d61e8bf96de324b5402aad0561de0c188360a65a4014eed190ff7dbc5051" exitCode=0 Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.503944 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9002-account-create-update-mlmq6" event={"ID":"25be920e-162b-4f60-851b-228167576b04","Type":"ContainerDied","Data":"a3d2d61e8bf96de324b5402aad0561de0c188360a65a4014eed190ff7dbc5051"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.525550 
4713 generic.go:334] "Generic (PLEG): container finished" podID="bd9da5a8-6d6b-404f-9cb7-3030364e35e2" containerID="3eede5e670de2d5a3dd4e1dcf1db98befac79c13e1dd6156de9f63432c9f9176" exitCode=0 Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.525650 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vgcgm" event={"ID":"bd9da5a8-6d6b-404f-9cb7-3030364e35e2","Type":"ContainerDied","Data":"3eede5e670de2d5a3dd4e1dcf1db98befac79c13e1dd6156de9f63432c9f9176"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.531001 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerStarted","Data":"7f674ed356ffe7f3da5f77e31d004959e7db65473f0ec1126f53966463fa7f2b"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.560717 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6979cc54d6-q8q5n" event={"ID":"0f319ea1-f399-41ba-81cd-edccb9905c98","Type":"ContainerStarted","Data":"006c8f9692e707cebd32acc9b7ecfe259f967159f41735427a59df8819530974"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.561271 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.586064 4713 generic.go:334] "Generic (PLEG): container finished" podID="13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" containerID="e5d52cf69f89e4aed1a905e924a2c925ec8432e1e1e0a66cc6aa84c7e8c2c09a" exitCode=0 Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.586193 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" event={"ID":"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3","Type":"ContainerDied","Data":"e5d52cf69f89e4aed1a905e924a2c925ec8432e1e1e0a66cc6aa84c7e8c2c09a"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.600855 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"b6b4ad11-424d-4394-b809-9fb4e559e255","Type":"ContainerStarted","Data":"3472c8cedca4db56e0d5c72c8f6d1256e0b998249063a3e4ef2d47fa79c3d1dd"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.612089 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6979cc54d6-q8q5n" podStartSLOduration=4.612065959 podStartE2EDuration="4.612065959s" podCreationTimestamp="2026-03-14 05:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:48.585009717 +0000 UTC m=+1611.672919027" watchObservedRunningTime="2026-03-14 05:53:48.612065959 +0000 UTC m=+1611.699975279" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.613100 4713 generic.go:334] "Generic (PLEG): container finished" podID="bec0abd6-d181-493f-a285-932a17fac41d" containerID="317be9ba6837e17cd3e2c8e35046a0dc9dd0dd7e55036a9016f3e2b8d607e836" exitCode=0 Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.613174 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5n98m" event={"ID":"bec0abd6-d181-493f-a285-932a17fac41d","Type":"ContainerDied","Data":"317be9ba6837e17cd3e2c8e35046a0dc9dd0dd7e55036a9016f3e2b8d607e836"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.706632 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-599cd54c4b-t7gdc" event={"ID":"f1d1368b-65d8-43fa-8025-0c41a7d0dd14","Type":"ContainerStarted","Data":"793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.706753 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-599cd54c4b-t7gdc" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.715994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"5171aada-64eb-4788-8446-346549791051","Type":"ContainerStarted","Data":"ae1ca86d1e572a524d57b43b8ddb14218dfbf127480ef6eefca67c2621a0bcfe"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.730406 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6cffcd59cb-mklc8"] Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.746128 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5cd69694c5-ns9d6"] Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.747806 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.758619 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.760820 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-684cc7695b-tnj9p"] Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.760964 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.776058 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5cd69694c5-ns9d6"] Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.776092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d28jb" event={"ID":"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e","Type":"ContainerStarted","Data":"f3f13f36c98e8b06153b0ac1947d2634c46b6f2eee47aa5b0eaea2069f209efc"} Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.776389 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-599cd54c4b-t7gdc" podStartSLOduration=13.776371043 podStartE2EDuration="13.776371043s" podCreationTimestamp="2026-03-14 05:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:48.741875524 +0000 UTC m=+1611.829784824" watchObservedRunningTime="2026-03-14 05:53:48.776371043 +0000 UTC m=+1611.864280343" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.892598 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tth2x\" (UniqueName: \"kubernetes.io/projected/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-kube-api-access-tth2x\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.892991 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data-custom\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.895031 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.895101 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-public-tls-certs\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.895227 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-internal-tls-certs\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.895537 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-combined-ca-bundle\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:48 crc kubenswrapper[4713]: I0314 05:53:48.934770 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-782d-account-create-update-pglqr" event={"ID":"faa23366-9513-4a8e-af1f-1b6b7596a5ac","Type":"ContainerStarted","Data":"677d1231a67f4dd596979c1b218a784a48b411494858cff15931de060d72051a"} Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:48.996118 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-595498c55-cl7wg"] Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.004033 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.004055 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tth2x\" (UniqueName: \"kubernetes.io/projected/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-kube-api-access-tth2x\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.004133 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data-custom\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.004289 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.004335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-public-tls-certs\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.004410 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-internal-tls-certs\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" 
Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.004660 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-combined-ca-bundle\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.010877 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.011074 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.083310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.095442 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data-custom\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.111653 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4b2f\" (UniqueName: \"kubernetes.io/projected/2d297aa0-037f-4330-994c-8075c9126844-kube-api-access-w4b2f\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.111710 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-public-tls-certs\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.111761 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.111824 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-internal-tls-certs\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.111856 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data-custom\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.111878 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-combined-ca-bundle\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 
05:53:49.112151 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tth2x\" (UniqueName: \"kubernetes.io/projected/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-kube-api-access-tth2x\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.117670 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-595498c55-cl7wg"] Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.128363 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-combined-ca-bundle\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.128448 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-public-tls-certs\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.128893 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-internal-tls-certs\") pod \"heat-api-5cd69694c5-ns9d6\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.129472 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.212812 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-internal-tls-certs\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.212866 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data-custom\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.212889 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-combined-ca-bundle\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.213014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4b2f\" (UniqueName: \"kubernetes.io/projected/2d297aa0-037f-4330-994c-8075c9126844-kube-api-access-w4b2f\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.213031 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-public-tls-certs\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " 
pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.213071 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.237476 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-public-tls-certs\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.245393 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data-custom\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.272690 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-internal-tls-certs\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.280839 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4b2f\" (UniqueName: \"kubernetes.io/projected/2d297aa0-037f-4330-994c-8075c9126844-kube-api-access-w4b2f\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc 
kubenswrapper[4713]: I0314 05:53:49.291258 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.298878 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-combined-ca-bundle\") pod \"heat-cfnapi-595498c55-cl7wg\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.596942 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.768288 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vgcgm" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.943792 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5zvm\" (UniqueName: \"kubernetes.io/projected/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-kube-api-access-n5zvm\") pod \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\" (UID: \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\") " Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.944037 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-operator-scripts\") pod \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\" (UID: \"bd9da5a8-6d6b-404f-9cb7-3030364e35e2\") " Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.947446 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd9da5a8-6d6b-404f-9cb7-3030364e35e2" (UID: "bd9da5a8-6d6b-404f-9cb7-3030364e35e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.949035 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-kube-api-access-n5zvm" (OuterVolumeSpecName: "kube-api-access-n5zvm") pod "bd9da5a8-6d6b-404f-9cb7-3030364e35e2" (UID: "bd9da5a8-6d6b-404f-9cb7-3030364e35e2"). InnerVolumeSpecName "kube-api-access-n5zvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.972421 4713 generic.go:334] "Generic (PLEG): container finished" podID="e3b63ac3-87f8-43e2-b546-0cd9d6025e7e" containerID="f3f13f36c98e8b06153b0ac1947d2634c46b6f2eee47aa5b0eaea2069f209efc" exitCode=0 Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.972502 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d28jb" event={"ID":"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e","Type":"ContainerDied","Data":"f3f13f36c98e8b06153b0ac1947d2634c46b6f2eee47aa5b0eaea2069f209efc"} Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.975154 4713 generic.go:334] "Generic (PLEG): container finished" podID="faa23366-9513-4a8e-af1f-1b6b7596a5ac" containerID="677d1231a67f4dd596979c1b218a784a48b411494858cff15931de060d72051a" exitCode=0 Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.975218 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-782d-account-create-update-pglqr" event={"ID":"faa23366-9513-4a8e-af1f-1b6b7596a5ac","Type":"ContainerDied","Data":"677d1231a67f4dd596979c1b218a784a48b411494858cff15931de060d72051a"} Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.981838 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vgcgm" event={"ID":"bd9da5a8-6d6b-404f-9cb7-3030364e35e2","Type":"ContainerDied","Data":"c40c993be315d7a04b85b9057dd079afc42339f81f3ceb116606b51c83a95b16"} Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.981884 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c40c993be315d7a04b85b9057dd079afc42339f81f3ceb116606b51c83a95b16" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.981938 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vgcgm" Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.988830 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerStarted","Data":"d677557d6954eec69c34f315885d278c5fcddac2f22ca4cd0cc858e11392e179"} Mar 14 05:53:49 crc kubenswrapper[4713]: I0314 05:53:49.997315 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d1b88956-89b2-49f5-881a-f757d005ee2a","Type":"ContainerStarted","Data":"02104eed73e5683784048be19afc6b3c5ada11195275f354a490827f6b0089d8"} Mar 14 05:53:50 crc kubenswrapper[4713]: I0314 05:53:50.001994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" event={"ID":"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3","Type":"ContainerStarted","Data":"89312b22f73ba4ceb12ed97adaa5487d8bd45d9885947d3935142cdff17922f5"} Mar 14 05:53:50 crc kubenswrapper[4713]: I0314 05:53:50.002281 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" Mar 14 05:53:50 crc kubenswrapper[4713]: I0314 05:53:50.041080 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" podStartSLOduration=15.041061437 podStartE2EDuration="15.041061437s" podCreationTimestamp="2026-03-14 05:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:50.037966039 +0000 UTC m=+1613.125875339" watchObservedRunningTime="2026-03-14 05:53:50.041061437 +0000 UTC m=+1613.128970737" Mar 14 05:53:50 crc kubenswrapper[4713]: I0314 05:53:50.048906 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5zvm\" (UniqueName: \"kubernetes.io/projected/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-kube-api-access-n5zvm\") on node \"crc\" 
DevicePath \"\"" Mar 14 05:53:50 crc kubenswrapper[4713]: I0314 05:53:50.048939 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9da5a8-6d6b-404f-9cb7-3030364e35e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:50 crc kubenswrapper[4713]: I0314 05:53:50.981031 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5cd69694c5-ns9d6"] Mar 14 05:53:50 crc kubenswrapper[4713]: I0314 05:53:50.999136 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-595498c55-cl7wg"] Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.014666 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5171aada-64eb-4788-8446-346549791051","Type":"ContainerStarted","Data":"7bbc9283c9423f9a5837b4734b9d1caa0d754e91c2b7e90e93c7753b9c92dd99"} Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.029712 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b6b4ad11-424d-4394-b809-9fb4e559e255","Type":"ContainerStarted","Data":"0d781d4682eaf1530927c1c223b6151c0de2e721e3d4707475c6ac14002ea632"} Mar 14 05:53:51 crc kubenswrapper[4713]: W0314 05:53:51.355553 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19f775ed_2ab4_464b_96c5_ccb0bc3b570d.slice/crio-edb19b6275adc1504ca8e508b5090abc7d45a9d4f6f2e16e7c4e80ca3e1f9d43 WatchSource:0}: Error finding container edb19b6275adc1504ca8e508b5090abc7d45a9d4f6f2e16e7c4e80ca3e1f9d43: Status 404 returned error can't find the container with id edb19b6275adc1504ca8e508b5090abc7d45a9d4f6f2e16e7c4e80ca3e1f9d43 Mar 14 05:53:51 crc kubenswrapper[4713]: W0314 05:53:51.402377 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d297aa0_037f_4330_994c_8075c9126844.slice/crio-a9c8b2fc7b00ef995a738ba724774d40330eb00bd930c23f3395296fee969308 WatchSource:0}: Error finding container a9c8b2fc7b00ef995a738ba724774d40330eb00bd930c23f3395296fee969308: Status 404 returned error can't find the container with id a9c8b2fc7b00ef995a738ba724774d40330eb00bd930c23f3395296fee969308 Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.621648 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.634322 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d28jb" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.690740 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9002-account-create-update-mlmq6" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.719316 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5n98m" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.724697 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0a922f-47a8-482f-b2e5-b9fb6176c221-operator-scripts\") pod \"da0a922f-47a8-482f-b2e5-b9fb6176c221\" (UID: \"da0a922f-47a8-482f-b2e5-b9fb6176c221\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.724966 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zlc6\" (UniqueName: \"kubernetes.io/projected/da0a922f-47a8-482f-b2e5-b9fb6176c221-kube-api-access-4zlc6\") pod \"da0a922f-47a8-482f-b2e5-b9fb6176c221\" (UID: \"da0a922f-47a8-482f-b2e5-b9fb6176c221\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.764879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0a922f-47a8-482f-b2e5-b9fb6176c221-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da0a922f-47a8-482f-b2e5-b9fb6176c221" (UID: "da0a922f-47a8-482f-b2e5-b9fb6176c221"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.764962 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.766612 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0a922f-47a8-482f-b2e5-b9fb6176c221-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.781774 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da0a922f-47a8-482f-b2e5-b9fb6176c221-kube-api-access-4zlc6" (OuterVolumeSpecName: "kube-api-access-4zlc6") pod "da0a922f-47a8-482f-b2e5-b9fb6176c221" (UID: "da0a922f-47a8-482f-b2e5-b9fb6176c221"). InnerVolumeSpecName "kube-api-access-4zlc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.876834 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjs9p\" (UniqueName: \"kubernetes.io/projected/25be920e-162b-4f60-851b-228167576b04-kube-api-access-gjs9p\") pod \"25be920e-162b-4f60-851b-228167576b04\" (UID: \"25be920e-162b-4f60-851b-228167576b04\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.876967 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa23366-9513-4a8e-af1f-1b6b7596a5ac-operator-scripts\") pod \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\" (UID: \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.877079 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbbqh\" (UniqueName: \"kubernetes.io/projected/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-kube-api-access-rbbqh\") pod \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\" (UID: \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.877189 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlhzw\" (UniqueName: \"kubernetes.io/projected/faa23366-9513-4a8e-af1f-1b6b7596a5ac-kube-api-access-vlhzw\") pod \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\" (UID: \"faa23366-9513-4a8e-af1f-1b6b7596a5ac\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.877262 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-operator-scripts\") pod \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\" (UID: \"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.877310 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bfr4\" (UniqueName: \"kubernetes.io/projected/bec0abd6-d181-493f-a285-932a17fac41d-kube-api-access-4bfr4\") pod \"bec0abd6-d181-493f-a285-932a17fac41d\" (UID: \"bec0abd6-d181-493f-a285-932a17fac41d\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.877345 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bec0abd6-d181-493f-a285-932a17fac41d-operator-scripts\") pod \"bec0abd6-d181-493f-a285-932a17fac41d\" (UID: \"bec0abd6-d181-493f-a285-932a17fac41d\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.877404 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25be920e-162b-4f60-851b-228167576b04-operator-scripts\") pod \"25be920e-162b-4f60-851b-228167576b04\" (UID: \"25be920e-162b-4f60-851b-228167576b04\") " Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.877964 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "e3b63ac3-87f8-43e2-b546-0cd9d6025e7e" (UID: "e3b63ac3-87f8-43e2-b546-0cd9d6025e7e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.878948 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25be920e-162b-4f60-851b-228167576b04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25be920e-162b-4f60-851b-228167576b04" (UID: "25be920e-162b-4f60-851b-228167576b04"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.879441 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bec0abd6-d181-493f-a285-932a17fac41d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bec0abd6-d181-493f-a285-932a17fac41d" (UID: "bec0abd6-d181-493f-a285-932a17fac41d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.879468 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zlc6\" (UniqueName: \"kubernetes.io/projected/da0a922f-47a8-482f-b2e5-b9fb6176c221-kube-api-access-4zlc6\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.879491 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25be920e-162b-4f60-851b-228167576b04-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.879502 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.880907 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faa23366-9513-4a8e-af1f-1b6b7596a5ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "faa23366-9513-4a8e-af1f-1b6b7596a5ac" (UID: "faa23366-9513-4a8e-af1f-1b6b7596a5ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.883578 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec0abd6-d181-493f-a285-932a17fac41d-kube-api-access-4bfr4" (OuterVolumeSpecName: "kube-api-access-4bfr4") pod "bec0abd6-d181-493f-a285-932a17fac41d" (UID: "bec0abd6-d181-493f-a285-932a17fac41d"). InnerVolumeSpecName "kube-api-access-4bfr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.896446 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-kube-api-access-rbbqh" (OuterVolumeSpecName: "kube-api-access-rbbqh") pod "e3b63ac3-87f8-43e2-b546-0cd9d6025e7e" (UID: "e3b63ac3-87f8-43e2-b546-0cd9d6025e7e"). InnerVolumeSpecName "kube-api-access-rbbqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.915451 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25be920e-162b-4f60-851b-228167576b04-kube-api-access-gjs9p" (OuterVolumeSpecName: "kube-api-access-gjs9p") pod "25be920e-162b-4f60-851b-228167576b04" (UID: "25be920e-162b-4f60-851b-228167576b04"). InnerVolumeSpecName "kube-api-access-gjs9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.927033 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa23366-9513-4a8e-af1f-1b6b7596a5ac-kube-api-access-vlhzw" (OuterVolumeSpecName: "kube-api-access-vlhzw") pod "faa23366-9513-4a8e-af1f-1b6b7596a5ac" (UID: "faa23366-9513-4a8e-af1f-1b6b7596a5ac"). InnerVolumeSpecName "kube-api-access-vlhzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.982858 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbbqh\" (UniqueName: \"kubernetes.io/projected/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e-kube-api-access-rbbqh\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.982893 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlhzw\" (UniqueName: \"kubernetes.io/projected/faa23366-9513-4a8e-af1f-1b6b7596a5ac-kube-api-access-vlhzw\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.982903 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bfr4\" (UniqueName: \"kubernetes.io/projected/bec0abd6-d181-493f-a285-932a17fac41d-kube-api-access-4bfr4\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.982913 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bec0abd6-d181-493f-a285-932a17fac41d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.982922 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjs9p\" (UniqueName: \"kubernetes.io/projected/25be920e-162b-4f60-851b-228167576b04-kube-api-access-gjs9p\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:51 crc kubenswrapper[4713]: I0314 05:53:51.982931 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa23366-9513-4a8e-af1f-1b6b7596a5ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.050804 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" 
event={"ID":"da0a922f-47a8-482f-b2e5-b9fb6176c221","Type":"ContainerDied","Data":"303d60c524ab6d91e414751fba0c977cf186f9d3b5c9a4d7074f653b70d68f58"} Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.050849 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bcce-account-create-update-ftbtx" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.050858 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="303d60c524ab6d91e414751fba0c977cf186f9d3b5c9a4d7074f653b70d68f58" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.053334 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-595498c55-cl7wg" event={"ID":"2d297aa0-037f-4330-994c-8075c9126844","Type":"ContainerStarted","Data":"a9c8b2fc7b00ef995a738ba724774d40330eb00bd930c23f3395296fee969308"} Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.060323 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-d28jb" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.060328 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d28jb" event={"ID":"e3b63ac3-87f8-43e2-b546-0cd9d6025e7e","Type":"ContainerDied","Data":"b5eb5625be77daa7359ac853aedbf4e043ce126b286a016bdc107cc2fb91c554"} Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.060454 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5eb5625be77daa7359ac853aedbf4e043ce126b286a016bdc107cc2fb91c554" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.062847 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5n98m" event={"ID":"bec0abd6-d181-493f-a285-932a17fac41d","Type":"ContainerDied","Data":"36c17a373ae7f4dd246052dca70cbe35f38c776c6b33bc7558e63f476526db34"} Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.062876 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5n98m" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.062888 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36c17a373ae7f4dd246052dca70cbe35f38c776c6b33bc7558e63f476526db34" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.067385 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5cd69694c5-ns9d6" event={"ID":"19f775ed-2ab4-464b-96c5-ccb0bc3b570d","Type":"ContainerStarted","Data":"edb19b6275adc1504ca8e508b5090abc7d45a9d4f6f2e16e7c4e80ca3e1f9d43"} Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.072197 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9002-account-create-update-mlmq6" event={"ID":"25be920e-162b-4f60-851b-228167576b04","Type":"ContainerDied","Data":"947da072b050f303f0b2f319ea15237e8b31f580b2d4e40739eb98496f7eaa32"} Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.072266 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="947da072b050f303f0b2f319ea15237e8b31f580b2d4e40739eb98496f7eaa32" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.072352 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9002-account-create-update-mlmq6" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.079688 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-782d-account-create-update-pglqr" event={"ID":"faa23366-9513-4a8e-af1f-1b6b7596a5ac","Type":"ContainerDied","Data":"76cf1e920fa7496ad506eda8a7c0fe65795a9d3e47ad1d497fffe14fab32ac31"} Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.079732 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76cf1e920fa7496ad506eda8a7c0fe65795a9d3e47ad1d497fffe14fab32ac31" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.079815 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-782d-account-create-update-pglqr" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.295076 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sjxhl"] Mar 14 05:53:52 crc kubenswrapper[4713]: E0314 05:53:52.295742 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b63ac3-87f8-43e2-b546-0cd9d6025e7e" containerName="mariadb-database-create" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.295766 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b63ac3-87f8-43e2-b546-0cd9d6025e7e" containerName="mariadb-database-create" Mar 14 05:53:52 crc kubenswrapper[4713]: E0314 05:53:52.295786 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd9da5a8-6d6b-404f-9cb7-3030364e35e2" containerName="mariadb-database-create" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.295794 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd9da5a8-6d6b-404f-9cb7-3030364e35e2" containerName="mariadb-database-create" Mar 14 05:53:52 crc kubenswrapper[4713]: E0314 05:53:52.295821 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec0abd6-d181-493f-a285-932a17fac41d" 
containerName="mariadb-database-create" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.295829 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec0abd6-d181-493f-a285-932a17fac41d" containerName="mariadb-database-create" Mar 14 05:53:52 crc kubenswrapper[4713]: E0314 05:53:52.295847 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0a922f-47a8-482f-b2e5-b9fb6176c221" containerName="mariadb-account-create-update" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.295854 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0a922f-47a8-482f-b2e5-b9fb6176c221" containerName="mariadb-account-create-update" Mar 14 05:53:52 crc kubenswrapper[4713]: E0314 05:53:52.295873 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25be920e-162b-4f60-851b-228167576b04" containerName="mariadb-account-create-update" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.295880 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="25be920e-162b-4f60-851b-228167576b04" containerName="mariadb-account-create-update" Mar 14 05:53:52 crc kubenswrapper[4713]: E0314 05:53:52.295899 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa23366-9513-4a8e-af1f-1b6b7596a5ac" containerName="mariadb-account-create-update" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.295906 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa23366-9513-4a8e-af1f-1b6b7596a5ac" containerName="mariadb-account-create-update" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.296249 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec0abd6-d181-493f-a285-932a17fac41d" containerName="mariadb-database-create" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.296268 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd9da5a8-6d6b-404f-9cb7-3030364e35e2" containerName="mariadb-database-create" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.296291 
4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="25be920e-162b-4f60-851b-228167576b04" containerName="mariadb-account-create-update" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.296305 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b63ac3-87f8-43e2-b546-0cd9d6025e7e" containerName="mariadb-database-create" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.296331 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa23366-9513-4a8e-af1f-1b6b7596a5ac" containerName="mariadb-account-create-update" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.296351 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="da0a922f-47a8-482f-b2e5-b9fb6176c221" containerName="mariadb-account-create-update" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.298617 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.322008 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sjxhl"] Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.356758 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59cfcdd844-lx8mr" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.364488 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59cfcdd844-lx8mr" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.394763 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2fsz\" (UniqueName: \"kubernetes.io/projected/0aeae175-1650-4587-b866-00a6d082a849-kube-api-access-q2fsz\") pod \"community-operators-sjxhl\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 
05:53:52.394855 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-utilities\") pod \"community-operators-sjxhl\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.394949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-catalog-content\") pod \"community-operators-sjxhl\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.471730 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b9cc97768-hg4ff"] Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.472044 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-b9cc97768-hg4ff" podUID="805c8988-dae7-41ae-8160-75ad28990e12" containerName="placement-log" containerID="cri-o://52e10daf48f474de7b04b71da57a7ef18a8d5d49e5c13edbdd7d11398dad1f6e" gracePeriod=30 Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.473878 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-b9cc97768-hg4ff" podUID="805c8988-dae7-41ae-8160-75ad28990e12" containerName="placement-api" containerID="cri-o://ae72f01107aa077345da42e895c2ca165408431f7b73013f1bc47b2dbd23a6b9" gracePeriod=30 Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.504090 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2fsz\" (UniqueName: \"kubernetes.io/projected/0aeae175-1650-4587-b866-00a6d082a849-kube-api-access-q2fsz\") pod \"community-operators-sjxhl\" (UID: 
\"0aeae175-1650-4587-b866-00a6d082a849\") " pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.504464 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-utilities\") pod \"community-operators-sjxhl\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.504549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-catalog-content\") pod \"community-operators-sjxhl\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.505185 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-utilities\") pod \"community-operators-sjxhl\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.505283 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-catalog-content\") pod \"community-operators-sjxhl\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.548597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2fsz\" (UniqueName: \"kubernetes.io/projected/0aeae175-1650-4587-b866-00a6d082a849-kube-api-access-q2fsz\") pod \"community-operators-sjxhl\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " 
pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:52 crc kubenswrapper[4713]: I0314 05:53:52.631093 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:53:53 crc kubenswrapper[4713]: I0314 05:53:53.146001 4713 generic.go:334] "Generic (PLEG): container finished" podID="805c8988-dae7-41ae-8160-75ad28990e12" containerID="52e10daf48f474de7b04b71da57a7ef18a8d5d49e5c13edbdd7d11398dad1f6e" exitCode=143 Mar 14 05:53:53 crc kubenswrapper[4713]: I0314 05:53:53.146117 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b9cc97768-hg4ff" event={"ID":"805c8988-dae7-41ae-8160-75ad28990e12","Type":"ContainerDied","Data":"52e10daf48f474de7b04b71da57a7ef18a8d5d49e5c13edbdd7d11398dad1f6e"} Mar 14 05:53:54 crc kubenswrapper[4713]: I0314 05:53:54.006461 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sjxhl"] Mar 14 05:53:54 crc kubenswrapper[4713]: I0314 05:53:54.211587 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjxhl" event={"ID":"0aeae175-1650-4587-b866-00a6d082a849","Type":"ContainerStarted","Data":"1c504176b75d6dcb307133ad04c70aa8e7fb3e15ede67e83941e8f7a59716906"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.231017 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cffcd59cb-mklc8" event={"ID":"d0f3b8e7-bac4-47da-8b3f-c18d437789e4","Type":"ContainerStarted","Data":"bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.231498 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6cffcd59cb-mklc8" podUID="d0f3b8e7-bac4-47da-8b3f-c18d437789e4" containerName="heat-api" containerID="cri-o://bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3" gracePeriod=60 Mar 14 05:53:55 crc 
kubenswrapper[4713]: I0314 05:53:55.231828 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6cffcd59cb-mklc8" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.234072 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5cd69694c5-ns9d6" event={"ID":"19f775ed-2ab4-464b-96c5-ccb0bc3b570d","Type":"ContainerStarted","Data":"e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.234481 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.243526 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-684cc7695b-tnj9p" event={"ID":"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c","Type":"ContainerStarted","Data":"3f498a7d7eab4a99bb1cd3cb28fcb2e2b1bc211f98ac82859d3017abc3c1e54c"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.243689 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-684cc7695b-tnj9p" podUID="f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" containerName="heat-cfnapi" containerID="cri-o://3f498a7d7eab4a99bb1cd3cb28fcb2e2b1bc211f98ac82859d3017abc3c1e54c" gracePeriod=60 Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.243953 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-684cc7695b-tnj9p" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.257777 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" event={"ID":"473ca677-e1ad-431f-9f9e-c8260e43cda2","Type":"ContainerStarted","Data":"9bdfa92ea3df746b61ff35d8948fab8a1dcb5e40974707ddb44990b9f35fbea1"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.259134 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:53:55 
crc kubenswrapper[4713]: I0314 05:53:55.263645 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerStarted","Data":"c0244abe004980ad0b6eaa29b9f687185e922aad276605d0df780d7e84cc702d"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.267626 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d1b88956-89b2-49f5-881a-f757d005ee2a","Type":"ContainerStarted","Data":"22c158c086346fb15ffe6bf4ed1c669de2a31cee5457732f9dd21171539c2a64"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.268925 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.270620 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6cffcd59cb-mklc8" podStartSLOduration=13.376674113 podStartE2EDuration="20.270598652s" podCreationTimestamp="2026-03-14 05:53:35 +0000 UTC" firstStartedPulling="2026-03-14 05:53:46.477584321 +0000 UTC m=+1609.565493621" lastFinishedPulling="2026-03-14 05:53:53.37150886 +0000 UTC m=+1616.459418160" observedRunningTime="2026-03-14 05:53:55.268619599 +0000 UTC m=+1618.356528899" watchObservedRunningTime="2026-03-14 05:53:55.270598652 +0000 UTC m=+1618.358507952" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.296261 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5171aada-64eb-4788-8446-346549791051","Type":"ContainerStarted","Data":"4c6df61230fccf0a085c0c0516cb75829eac4810af20f36b39f0742c5fce1f22"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.311724 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b896f6bb4-lbjfr" event={"ID":"af1eb3bb-1126-4289-9156-a804d676272f","Type":"ContainerStarted","Data":"8da51f5adb108f8cd658a1e20671519fcbdc941d8cc62c4a6c354d817d8225c9"} Mar 
14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.313262 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.323031 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-684cc7695b-tnj9p" podStartSLOduration=13.369680518 podStartE2EDuration="20.323010951s" podCreationTimestamp="2026-03-14 05:53:35 +0000 UTC" firstStartedPulling="2026-03-14 05:53:46.43960293 +0000 UTC m=+1609.527512230" lastFinishedPulling="2026-03-14 05:53:53.392933363 +0000 UTC m=+1616.480842663" observedRunningTime="2026-03-14 05:53:55.292770538 +0000 UTC m=+1618.380679848" watchObservedRunningTime="2026-03-14 05:53:55.323010951 +0000 UTC m=+1618.410920251" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.351371 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b6b4ad11-424d-4394-b809-9fb4e559e255","Type":"ContainerStarted","Data":"d033f16bb63e2004278cbb75a79f5f9fe4dfe02d0ddefd0e67fc4a3d55ce7e59"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.399027 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-595498c55-cl7wg" event={"ID":"2d297aa0-037f-4330-994c-8075c9126844","Type":"ContainerStarted","Data":"09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e"} Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.399054 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=10.399040503 podStartE2EDuration="10.399040503s" podCreationTimestamp="2026-03-14 05:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:55.32076979 +0000 UTC m=+1618.408679090" watchObservedRunningTime="2026-03-14 05:53:55.399040503 +0000 UTC m=+1618.486949803" Mar 14 05:53:55 
crc kubenswrapper[4713]: I0314 05:53:55.400179 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.425884 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5cd69694c5-ns9d6" podStartSLOduration=5.193487091 podStartE2EDuration="7.425858968s" podCreationTimestamp="2026-03-14 05:53:48 +0000 UTC" firstStartedPulling="2026-03-14 05:53:51.359486553 +0000 UTC m=+1614.447395853" lastFinishedPulling="2026-03-14 05:53:53.59185841 +0000 UTC m=+1616.679767730" observedRunningTime="2026-03-14 05:53:55.353827563 +0000 UTC m=+1618.441736873" watchObservedRunningTime="2026-03-14 05:53:55.425858968 +0000 UTC m=+1618.513768268" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.460745 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" podStartSLOduration=5.218372022 podStartE2EDuration="11.460713788s" podCreationTimestamp="2026-03-14 05:53:44 +0000 UTC" firstStartedPulling="2026-03-14 05:53:47.13621949 +0000 UTC m=+1610.224128790" lastFinishedPulling="2026-03-14 05:53:53.378561246 +0000 UTC m=+1616.466470556" observedRunningTime="2026-03-14 05:53:55.39548002 +0000 UTC m=+1618.483389320" watchObservedRunningTime="2026-03-14 05:53:55.460713788 +0000 UTC m=+1618.548623108" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.481638 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.481604594 podStartE2EDuration="10.481604594s" podCreationTimestamp="2026-03-14 05:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:55.417110609 +0000 UTC m=+1618.505019909" watchObservedRunningTime="2026-03-14 05:53:55.481604594 +0000 UTC m=+1618.569513894" Mar 14 05:53:55 crc 
kubenswrapper[4713]: I0314 05:53:55.509493 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-b896f6bb4-lbjfr" podStartSLOduration=5.255549966 podStartE2EDuration="11.509471681s" podCreationTimestamp="2026-03-14 05:53:44 +0000 UTC" firstStartedPulling="2026-03-14 05:53:47.139498664 +0000 UTC m=+1610.227407964" lastFinishedPulling="2026-03-14 05:53:53.393420369 +0000 UTC m=+1616.481329679" observedRunningTime="2026-03-14 05:53:55.451765232 +0000 UTC m=+1618.539674552" watchObservedRunningTime="2026-03-14 05:53:55.509471681 +0000 UTC m=+1618.597380981" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.523382 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.523361304 podStartE2EDuration="10.523361304s" podCreationTimestamp="2026-03-14 05:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:53:55.48149837 +0000 UTC m=+1618.569407680" watchObservedRunningTime="2026-03-14 05:53:55.523361304 +0000 UTC m=+1618.611270604" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.552677 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-595498c55-cl7wg" podStartSLOduration=5.4860048169999995 podStartE2EDuration="7.552650886s" podCreationTimestamp="2026-03-14 05:53:48 +0000 UTC" firstStartedPulling="2026-03-14 05:53:51.406730777 +0000 UTC m=+1614.494640087" lastFinishedPulling="2026-03-14 05:53:53.473376856 +0000 UTC m=+1616.561286156" observedRunningTime="2026-03-14 05:53:55.507805408 +0000 UTC m=+1618.595714708" watchObservedRunningTime="2026-03-14 05:53:55.552650886 +0000 UTC m=+1618.640560196" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.728272 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrsf8"] Mar 14 05:53:55 crc 
kubenswrapper[4713]: I0314 05:53:55.730649 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.735627 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.735893 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.736113 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2xfb9" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.755300 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrsf8"] Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.898653 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.898711 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-config-data\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.898870 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxn9\" (UniqueName: \"kubernetes.io/projected/0cd56fa6-325c-4813-bffe-a2cd3bf82257-kube-api-access-5dxn9\") pod 
\"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.899058 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-scripts\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:55 crc kubenswrapper[4713]: I0314 05:53:55.919940 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.009007 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.009072 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-config-data\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.009120 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxn9\" (UniqueName: \"kubernetes.io/projected/0cd56fa6-325c-4813-bffe-a2cd3bf82257-kube-api-access-5dxn9\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 
05:53:56.009199 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-scripts\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.048589 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-config-data\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.061740 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.078770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-scripts\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.100998 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dxn9\" (UniqueName: \"kubernetes.io/projected/0cd56fa6-325c-4813-bffe-a2cd3bf82257-kube-api-access-5dxn9\") pod \"nova-cell0-conductor-db-sync-zrsf8\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") " pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.201087 4713 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fj45w"] Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.225527 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" podUID="02ae920c-9439-4c60-904b-bea08ca59dac" containerName="dnsmasq-dns" containerID="cri-o://756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78" gracePeriod=10 Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.363923 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zrsf8" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.390912 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.391665 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.497515 4713 generic.go:334] "Generic (PLEG): container finished" podID="473ca677-e1ad-431f-9f9e-c8260e43cda2" containerID="9bdfa92ea3df746b61ff35d8948fab8a1dcb5e40974707ddb44990b9f35fbea1" exitCode=1 Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.497621 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" event={"ID":"473ca677-e1ad-431f-9f9e-c8260e43cda2","Type":"ContainerDied","Data":"9bdfa92ea3df746b61ff35d8948fab8a1dcb5e40974707ddb44990b9f35fbea1"} Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.498494 4713 scope.go:117] "RemoveContainer" containerID="9bdfa92ea3df746b61ff35d8948fab8a1dcb5e40974707ddb44990b9f35fbea1" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.584618 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.614621 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerStarted","Data":"08233fd3048ee585cd90b9e5a2f61b6e70c48aedef235f462fe9c615c3fa08b4"} Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.638444 4713 generic.go:334] "Generic (PLEG): container finished" podID="af1eb3bb-1126-4289-9156-a804d676272f" containerID="8da51f5adb108f8cd658a1e20671519fcbdc941d8cc62c4a6c354d817d8225c9" exitCode=1 Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.638507 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b896f6bb4-lbjfr" event={"ID":"af1eb3bb-1126-4289-9156-a804d676272f","Type":"ContainerDied","Data":"8da51f5adb108f8cd658a1e20671519fcbdc941d8cc62c4a6c354d817d8225c9"} Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.639296 4713 scope.go:117] "RemoveContainer" containerID="8da51f5adb108f8cd658a1e20671519fcbdc941d8cc62c4a6c354d817d8225c9" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.661279 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.687051 4713 generic.go:334] "Generic (PLEG): container finished" podID="0aeae175-1650-4587-b866-00a6d082a849" containerID="7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea" exitCode=0 Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.687152 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjxhl" event={"ID":"0aeae175-1650-4587-b866-00a6d082a849","Type":"ContainerDied","Data":"7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea"} Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.695809 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.695891 4713 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.826192 4713 generic.go:334] "Generic (PLEG): container finished" podID="805c8988-dae7-41ae-8160-75ad28990e12" containerID="ae72f01107aa077345da42e895c2ca165408431f7b73013f1bc47b2dbd23a6b9" exitCode=0 Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.826853 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b9cc97768-hg4ff" event={"ID":"805c8988-dae7-41ae-8160-75ad28990e12","Type":"ContainerDied","Data":"ae72f01107aa077345da42e895c2ca165408431f7b73013f1bc47b2dbd23a6b9"} Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.829457 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.829502 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.903864 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.939197 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:53:56 crc kubenswrapper[4713]: I0314 05:53:56.960245 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.059003 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-scripts\") pod \"805c8988-dae7-41ae-8160-75ad28990e12\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.059093 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/805c8988-dae7-41ae-8160-75ad28990e12-logs\") pod \"805c8988-dae7-41ae-8160-75ad28990e12\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.059189 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-combined-ca-bundle\") pod \"805c8988-dae7-41ae-8160-75ad28990e12\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.059299 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rkz\" (UniqueName: \"kubernetes.io/projected/805c8988-dae7-41ae-8160-75ad28990e12-kube-api-access-94rkz\") pod \"805c8988-dae7-41ae-8160-75ad28990e12\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.059347 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-config-data\") pod \"805c8988-dae7-41ae-8160-75ad28990e12\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " Mar 14 
05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.059374 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-public-tls-certs\") pod \"805c8988-dae7-41ae-8160-75ad28990e12\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.059484 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-internal-tls-certs\") pod \"805c8988-dae7-41ae-8160-75ad28990e12\" (UID: \"805c8988-dae7-41ae-8160-75ad28990e12\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.066639 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/805c8988-dae7-41ae-8160-75ad28990e12-logs" (OuterVolumeSpecName: "logs") pod "805c8988-dae7-41ae-8160-75ad28990e12" (UID: "805c8988-dae7-41ae-8160-75ad28990e12"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.080844 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805c8988-dae7-41ae-8160-75ad28990e12-kube-api-access-94rkz" (OuterVolumeSpecName: "kube-api-access-94rkz") pod "805c8988-dae7-41ae-8160-75ad28990e12" (UID: "805c8988-dae7-41ae-8160-75ad28990e12"). InnerVolumeSpecName "kube-api-access-94rkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.091512 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-scripts" (OuterVolumeSpecName: "scripts") pod "805c8988-dae7-41ae-8160-75ad28990e12" (UID: "805c8988-dae7-41ae-8160-75ad28990e12"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.164461 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94rkz\" (UniqueName: \"kubernetes.io/projected/805c8988-dae7-41ae-8160-75ad28990e12-kube-api-access-94rkz\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.164508 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.164523 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/805c8988-dae7-41ae-8160-75ad28990e12-logs\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.197759 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-config-data" (OuterVolumeSpecName: "config-data") pod "805c8988-dae7-41ae-8160-75ad28990e12" (UID: "805c8988-dae7-41ae-8160-75ad28990e12"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.275129 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.304484 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrsf8"] Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.317729 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "805c8988-dae7-41ae-8160-75ad28990e12" (UID: "805c8988-dae7-41ae-8160-75ad28990e12"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.388767 4713 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.431933 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "805c8988-dae7-41ae-8160-75ad28990e12" (UID: "805c8988-dae7-41ae-8160-75ad28990e12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.492529 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.525236 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.534555 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "805c8988-dae7-41ae-8160-75ad28990e12" (UID: "805c8988-dae7-41ae-8160-75ad28990e12"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.606552 4713 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/805c8988-dae7-41ae-8160-75ad28990e12-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.708694 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-nb\") pod \"02ae920c-9439-4c60-904b-bea08ca59dac\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.708769 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpx8d\" (UniqueName: \"kubernetes.io/projected/02ae920c-9439-4c60-904b-bea08ca59dac-kube-api-access-zpx8d\") pod \"02ae920c-9439-4c60-904b-bea08ca59dac\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " Mar 14 05:53:57 crc kubenswrapper[4713]: 
I0314 05:53:57.708835 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-swift-storage-0\") pod \"02ae920c-9439-4c60-904b-bea08ca59dac\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.708911 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-svc\") pod \"02ae920c-9439-4c60-904b-bea08ca59dac\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.709071 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-sb\") pod \"02ae920c-9439-4c60-904b-bea08ca59dac\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.709119 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-config\") pod \"02ae920c-9439-4c60-904b-bea08ca59dac\" (UID: \"02ae920c-9439-4c60-904b-bea08ca59dac\") " Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.725605 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ae920c-9439-4c60-904b-bea08ca59dac-kube-api-access-zpx8d" (OuterVolumeSpecName: "kube-api-access-zpx8d") pod "02ae920c-9439-4c60-904b-bea08ca59dac" (UID: "02ae920c-9439-4c60-904b-bea08ca59dac"). InnerVolumeSpecName "kube-api-access-zpx8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.806474 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02ae920c-9439-4c60-904b-bea08ca59dac" (UID: "02ae920c-9439-4c60-904b-bea08ca59dac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.812139 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpx8d\" (UniqueName: \"kubernetes.io/projected/02ae920c-9439-4c60-904b-bea08ca59dac-kube-api-access-zpx8d\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.812174 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.815523 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "02ae920c-9439-4c60-904b-bea08ca59dac" (UID: "02ae920c-9439-4c60-904b-bea08ca59dac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.852197 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-config" (OuterVolumeSpecName: "config") pod "02ae920c-9439-4c60-904b-bea08ca59dac" (UID: "02ae920c-9439-4c60-904b-bea08ca59dac"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.866829 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b9cc97768-hg4ff" event={"ID":"805c8988-dae7-41ae-8160-75ad28990e12","Type":"ContainerDied","Data":"0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281"} Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.866880 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b9cc97768-hg4ff" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.866916 4713 scope.go:117] "RemoveContainer" containerID="ae72f01107aa077345da42e895c2ca165408431f7b73013f1bc47b2dbd23a6b9" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.868590 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "02ae920c-9439-4c60-904b-bea08ca59dac" (UID: "02ae920c-9439-4c60-904b-bea08ca59dac"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.873697 4713 generic.go:334] "Generic (PLEG): container finished" podID="02ae920c-9439-4c60-904b-bea08ca59dac" containerID="756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78" exitCode=0 Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.873773 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" event={"ID":"02ae920c-9439-4c60-904b-bea08ca59dac","Type":"ContainerDied","Data":"756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78"} Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.873801 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" event={"ID":"02ae920c-9439-4c60-904b-bea08ca59dac","Type":"ContainerDied","Data":"4ad836fb6b9e13e67c2e2ba7ea17e7aea7020bbe756fa0150f8ca509535b927b"} Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.873921 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.878656 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zrsf8" event={"ID":"0cd56fa6-325c-4813-bffe-a2cd3bf82257","Type":"ContainerStarted","Data":"025ccdd158213340d3559a8af8b26bc234ed78ab1c3e94d4e6df69a8e584d005"} Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.881663 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.881708 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.920607 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "02ae920c-9439-4c60-904b-bea08ca59dac" (UID: "02ae920c-9439-4c60-904b-bea08ca59dac"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.923988 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.924025 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.924039 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.924048 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ae920c-9439-4c60-904b-bea08ca59dac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:57 crc kubenswrapper[4713]: I0314 05:53:57.941552 4713 scope.go:117] "RemoveContainer" containerID="52e10daf48f474de7b04b71da57a7ef18a8d5d49e5c13edbdd7d11398dad1f6e" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.019350 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b9cc97768-hg4ff"] Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.033163 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b9cc97768-hg4ff"] Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.215877 4713 scope.go:117] "RemoveContainer" containerID="756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.220012 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fj45w"] Mar 14 05:53:58 crc 
kubenswrapper[4713]: I0314 05:53:58.231950 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fj45w"] Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.253499 4713 scope.go:117] "RemoveContainer" containerID="7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.279405 4713 scope.go:117] "RemoveContainer" containerID="756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78" Mar 14 05:53:58 crc kubenswrapper[4713]: E0314 05:53:58.282745 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78\": container with ID starting with 756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78 not found: ID does not exist" containerID="756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.282790 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78"} err="failed to get container status \"756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78\": rpc error: code = NotFound desc = could not find container \"756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78\": container with ID starting with 756eee930b3bce0cf557242f49403b6f695e3982e832f0744fc14ca3faf97f78 not found: ID does not exist" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.282831 4713 scope.go:117] "RemoveContainer" containerID="7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b" Mar 14 05:53:58 crc kubenswrapper[4713]: E0314 05:53:58.283274 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b\": 
container with ID starting with 7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b not found: ID does not exist" containerID="7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.283329 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b"} err="failed to get container status \"7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b\": rpc error: code = NotFound desc = could not find container \"7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b\": container with ID starting with 7e99ce6fe1a31dbcd14e549366054d673de91654c52ed9511f3f37369854673b not found: ID does not exist" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.813804 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6cffcd59cb-mklc8" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.923451 4713 generic.go:334] "Generic (PLEG): container finished" podID="473ca677-e1ad-431f-9f9e-c8260e43cda2" containerID="f8fedfbfb80f728f3a8d00625b66856745d0e0fed6183a5b796a31320f20189d" exitCode=1 Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.923861 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" event={"ID":"473ca677-e1ad-431f-9f9e-c8260e43cda2","Type":"ContainerDied","Data":"f8fedfbfb80f728f3a8d00625b66856745d0e0fed6183a5b796a31320f20189d"} Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.923895 4713 scope.go:117] "RemoveContainer" containerID="9bdfa92ea3df746b61ff35d8948fab8a1dcb5e40974707ddb44990b9f35fbea1" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.924495 4713 scope.go:117] "RemoveContainer" containerID="f8fedfbfb80f728f3a8d00625b66856745d0e0fed6183a5b796a31320f20189d" Mar 14 05:53:58 crc kubenswrapper[4713]: E0314 05:53:58.925130 4713 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5955cd59bf-ttt8j_openstack(473ca677-e1ad-431f-9f9e-c8260e43cda2)\"" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.964400 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b896f6bb4-lbjfr" event={"ID":"af1eb3bb-1126-4289-9156-a804d676272f","Type":"ContainerStarted","Data":"e9e4f37a635fcaa058bccde83e26d5ad3e7aac7a29f11ee3ee44eed009540686"} Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.965525 4713 scope.go:117] "RemoveContainer" containerID="e9e4f37a635fcaa058bccde83e26d5ad3e7aac7a29f11ee3ee44eed009540686" Mar 14 05:53:58 crc kubenswrapper[4713]: E0314 05:53:58.965849 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-b896f6bb4-lbjfr_openstack(af1eb3bb-1126-4289-9156-a804d676272f)\"" pod="openstack/heat-api-b896f6bb4-lbjfr" podUID="af1eb3bb-1126-4289-9156-a804d676272f" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.982982 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chbwh\" (UniqueName: \"kubernetes.io/projected/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-kube-api-access-chbwh\") pod \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.983092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data\") pod \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " Mar 14 05:53:58 crc 
kubenswrapper[4713]: I0314 05:53:58.983215 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-combined-ca-bundle\") pod \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.983336 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data-custom\") pod \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\" (UID: \"d0f3b8e7-bac4-47da-8b3f-c18d437789e4\") " Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.993060 4713 generic.go:334] "Generic (PLEG): container finished" podID="d0f3b8e7-bac4-47da-8b3f-c18d437789e4" containerID="bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3" exitCode=0 Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.993118 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cffcd59cb-mklc8" event={"ID":"d0f3b8e7-bac4-47da-8b3f-c18d437789e4","Type":"ContainerDied","Data":"bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3"} Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.993142 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cffcd59cb-mklc8" event={"ID":"d0f3b8e7-bac4-47da-8b3f-c18d437789e4","Type":"ContainerDied","Data":"6af596e50714b2143f24e0c81a3b7cb8c1e1ca0c2dafcda983590ffb9ae3b0e6"} Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.993191 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6cffcd59cb-mklc8" Mar 14 05:53:58 crc kubenswrapper[4713]: I0314 05:53:58.998633 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-kube-api-access-chbwh" (OuterVolumeSpecName: "kube-api-access-chbwh") pod "d0f3b8e7-bac4-47da-8b3f-c18d437789e4" (UID: "d0f3b8e7-bac4-47da-8b3f-c18d437789e4"). InnerVolumeSpecName "kube-api-access-chbwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.017875 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.021575 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d0f3b8e7-bac4-47da-8b3f-c18d437789e4" (UID: "d0f3b8e7-bac4-47da-8b3f-c18d437789e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.056007 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0f3b8e7-bac4-47da-8b3f-c18d437789e4" (UID: "d0f3b8e7-bac4-47da-8b3f-c18d437789e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.089010 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.089047 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chbwh\" (UniqueName: \"kubernetes.io/projected/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-kube-api-access-chbwh\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.089059 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.098820 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data" (OuterVolumeSpecName: "config-data") pod "d0f3b8e7-bac4-47da-8b3f-c18d437789e4" (UID: "d0f3b8e7-bac4-47da-8b3f-c18d437789e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.192030 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b8e7-bac4-47da-8b3f-c18d437789e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.276473 4713 scope.go:117] "RemoveContainer" containerID="bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.337558 4713 scope.go:117] "RemoveContainer" containerID="bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3" Mar 14 05:53:59 crc kubenswrapper[4713]: E0314 05:53:59.339431 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3\": container with ID starting with bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3 not found: ID does not exist" containerID="bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.339471 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3"} err="failed to get container status \"bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3\": rpc error: code = NotFound desc = could not find container \"bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3\": container with ID starting with bff24db266efaac354aeb38e9f3e21807795cb8d7d0fffbd7ce26e2fa75c9dd3 not found: ID does not exist" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.399535 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6cffcd59cb-mklc8"] Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.404558 4713 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/heat-api-6cffcd59cb-mklc8"] Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.578191 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02ae920c-9439-4c60-904b-bea08ca59dac" path="/var/lib/kubelet/pods/02ae920c-9439-4c60-904b-bea08ca59dac/volumes" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.578925 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="805c8988-dae7-41ae-8160-75ad28990e12" path="/var/lib/kubelet/pods/805c8988-dae7-41ae-8160-75ad28990e12/volumes" Mar 14 05:53:59 crc kubenswrapper[4713]: I0314 05:53:59.579572 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f3b8e7-bac4-47da-8b3f-c18d437789e4" path="/var/lib/kubelet/pods/d0f3b8e7-bac4-47da-8b3f-c18d437789e4/volumes" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.045659 4713 generic.go:334] "Generic (PLEG): container finished" podID="af1eb3bb-1126-4289-9156-a804d676272f" containerID="e9e4f37a635fcaa058bccde83e26d5ad3e7aac7a29f11ee3ee44eed009540686" exitCode=1 Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.045768 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b896f6bb4-lbjfr" event={"ID":"af1eb3bb-1126-4289-9156-a804d676272f","Type":"ContainerDied","Data":"e9e4f37a635fcaa058bccde83e26d5ad3e7aac7a29f11ee3ee44eed009540686"} Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.045825 4713 scope.go:117] "RemoveContainer" containerID="8da51f5adb108f8cd658a1e20671519fcbdc941d8cc62c4a6c354d817d8225c9" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.046863 4713 scope.go:117] "RemoveContainer" containerID="e9e4f37a635fcaa058bccde83e26d5ad3e7aac7a29f11ee3ee44eed009540686" Mar 14 05:54:00 crc kubenswrapper[4713]: E0314 05:54:00.047392 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api 
pod=heat-api-b896f6bb4-lbjfr_openstack(af1eb3bb-1126-4289-9156-a804d676272f)\"" pod="openstack/heat-api-b896f6bb4-lbjfr" podUID="af1eb3bb-1126-4289-9156-a804d676272f" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.060306 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjxhl" event={"ID":"0aeae175-1650-4587-b866-00a6d082a849","Type":"ContainerStarted","Data":"b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef"} Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.072854 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.072958 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.073911 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"869960ea-c2fe-4a61-8f70-2e7724af6426","Type":"ContainerStarted","Data":"089acb9db1872f2c3f535abfbb64511cd6c32f2e3e32c7c3e8a8c0028efaefbd"} Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.096116 4713 scope.go:117] "RemoveContainer" containerID="f8fedfbfb80f728f3a8d00625b66856745d0e0fed6183a5b796a31320f20189d" Mar 14 05:54:00 crc kubenswrapper[4713]: E0314 05:54:00.096489 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5955cd59bf-ttt8j_openstack(473ca677-e1ad-431f-9f9e-c8260e43cda2)\"" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.114388 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.114484 
4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.122145 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.027405442 podStartE2EDuration="35.122125786s" podCreationTimestamp="2026-03-14 05:53:25 +0000 UTC" firstStartedPulling="2026-03-14 05:53:26.426078966 +0000 UTC m=+1589.513988266" lastFinishedPulling="2026-03-14 05:53:58.5207993 +0000 UTC m=+1621.608708610" observedRunningTime="2026-03-14 05:54:00.120411212 +0000 UTC m=+1623.208320522" watchObservedRunningTime="2026-03-14 05:54:00.122125786 +0000 UTC m=+1623.210035086" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.123454 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.124161 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerStarted","Data":"e7c8c4c5aabaa4465f3f3275843379f8eeb6f17ee04cf8710e9871f8a6c4e478"} Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.124358 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.239081 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.649243266 podStartE2EDuration="15.23904816s" podCreationTimestamp="2026-03-14 05:53:45 +0000 UTC" firstStartedPulling="2026-03-14 05:53:47.92797914 +0000 UTC m=+1611.015888440" lastFinishedPulling="2026-03-14 05:53:58.517784033 +0000 UTC m=+1621.605693334" observedRunningTime="2026-03-14 05:54:00.162143311 +0000 UTC m=+1623.250052631" watchObservedRunningTime="2026-03-14 05:54:00.23904816 +0000 UTC m=+1623.326957460" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.339737 
4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557794-zhr59"] Mar 14 05:54:00 crc kubenswrapper[4713]: E0314 05:54:00.341147 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ae920c-9439-4c60-904b-bea08ca59dac" containerName="init" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.341266 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ae920c-9439-4c60-904b-bea08ca59dac" containerName="init" Mar 14 05:54:00 crc kubenswrapper[4713]: E0314 05:54:00.341362 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805c8988-dae7-41ae-8160-75ad28990e12" containerName="placement-log" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.341446 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="805c8988-dae7-41ae-8160-75ad28990e12" containerName="placement-log" Mar 14 05:54:00 crc kubenswrapper[4713]: E0314 05:54:00.341555 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805c8988-dae7-41ae-8160-75ad28990e12" containerName="placement-api" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.341658 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="805c8988-dae7-41ae-8160-75ad28990e12" containerName="placement-api" Mar 14 05:54:00 crc kubenswrapper[4713]: E0314 05:54:00.341768 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f3b8e7-bac4-47da-8b3f-c18d437789e4" containerName="heat-api" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.341845 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f3b8e7-bac4-47da-8b3f-c18d437789e4" containerName="heat-api" Mar 14 05:54:00 crc kubenswrapper[4713]: E0314 05:54:00.345367 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ae920c-9439-4c60-904b-bea08ca59dac" containerName="dnsmasq-dns" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.345505 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="02ae920c-9439-4c60-904b-bea08ca59dac" containerName="dnsmasq-dns" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.346146 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="805c8988-dae7-41ae-8160-75ad28990e12" containerName="placement-log" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.346629 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="02ae920c-9439-4c60-904b-bea08ca59dac" containerName="dnsmasq-dns" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.346740 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f3b8e7-bac4-47da-8b3f-c18d437789e4" containerName="heat-api" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.348444 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="805c8988-dae7-41ae-8160-75ad28990e12" containerName="placement-api" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.349850 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557794-zhr59" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.352596 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.352761 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.353041 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.385282 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557794-zhr59"] Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.448932 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkv4g\" (UniqueName: 
\"kubernetes.io/projected/d80784b9-9a04-4aca-9515-d9540532b039-kube-api-access-gkv4g\") pod \"auto-csr-approver-29557794-zhr59\" (UID: \"d80784b9-9a04-4aca-9515-d9540532b039\") " pod="openshift-infra/auto-csr-approver-29557794-zhr59" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.550735 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkv4g\" (UniqueName: \"kubernetes.io/projected/d80784b9-9a04-4aca-9515-d9540532b039-kube-api-access-gkv4g\") pod \"auto-csr-approver-29557794-zhr59\" (UID: \"d80784b9-9a04-4aca-9515-d9540532b039\") " pod="openshift-infra/auto-csr-approver-29557794-zhr59" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.589044 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkv4g\" (UniqueName: \"kubernetes.io/projected/d80784b9-9a04-4aca-9515-d9540532b039-kube-api-access-gkv4g\") pod \"auto-csr-approver-29557794-zhr59\" (UID: \"d80784b9-9a04-4aca-9515-d9540532b039\") " pod="openshift-infra/auto-csr-approver-29557794-zhr59" Mar 14 05:54:00 crc kubenswrapper[4713]: I0314 05:54:00.685910 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557794-zhr59" Mar 14 05:54:01 crc kubenswrapper[4713]: I0314 05:54:01.047567 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 05:54:01 crc kubenswrapper[4713]: I0314 05:54:01.094568 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 05:54:01 crc kubenswrapper[4713]: I0314 05:54:01.184730 4713 scope.go:117] "RemoveContainer" containerID="e9e4f37a635fcaa058bccde83e26d5ad3e7aac7a29f11ee3ee44eed009540686" Mar 14 05:54:01 crc kubenswrapper[4713]: E0314 05:54:01.185183 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-b896f6bb4-lbjfr_openstack(af1eb3bb-1126-4289-9156-a804d676272f)\"" pod="openstack/heat-api-b896f6bb4-lbjfr" podUID="af1eb3bb-1126-4289-9156-a804d676272f" Mar 14 05:54:01 crc kubenswrapper[4713]: I0314 05:54:01.186455 4713 scope.go:117] "RemoveContainer" containerID="f8fedfbfb80f728f3a8d00625b66856745d0e0fed6183a5b796a31320f20189d" Mar 14 05:54:01 crc kubenswrapper[4713]: E0314 05:54:01.186681 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5955cd59bf-ttt8j_openstack(473ca677-e1ad-431f-9f9e-c8260e43cda2)\"" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" Mar 14 05:54:01 crc kubenswrapper[4713]: I0314 05:54:01.523759 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557794-zhr59"] Mar 14 05:54:01 crc kubenswrapper[4713]: I0314 05:54:01.925884 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dzqjx"] Mar 14 
05:54:01 crc kubenswrapper[4713]: I0314 05:54:01.931056 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:01 crc kubenswrapper[4713]: I0314 05:54:01.940942 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzqjx"] Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.006228 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-catalog-content\") pod \"certified-operators-dzqjx\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") " pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.006285 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4znlt\" (UniqueName: \"kubernetes.io/projected/8711e2a1-6795-4c17-afad-a95c01fbf5f9-kube-api-access-4znlt\") pod \"certified-operators-dzqjx\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") " pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.006369 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-utilities\") pod \"certified-operators-dzqjx\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") " pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.025895 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.108303 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-catalog-content\") pod \"certified-operators-dzqjx\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") " pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.108707 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-catalog-content\") pod \"certified-operators-dzqjx\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") " pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.108764 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4znlt\" (UniqueName: \"kubernetes.io/projected/8711e2a1-6795-4c17-afad-a95c01fbf5f9-kube-api-access-4znlt\") pod \"certified-operators-dzqjx\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") " pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.108834 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-utilities\") pod \"certified-operators-dzqjx\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") " pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.109135 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-utilities\") pod \"certified-operators-dzqjx\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") " pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.133954 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4znlt\" (UniqueName: 
\"kubernetes.io/projected/8711e2a1-6795-4c17-afad-a95c01fbf5f9-kube-api-access-4znlt\") pod \"certified-operators-dzqjx\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") " pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.173989 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-fj45w" podUID="02ae920c-9439-4c60-904b-bea08ca59dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.209:5353: i/o timeout" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.224437 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557794-zhr59" event={"ID":"d80784b9-9a04-4aca-9515-d9540532b039","Type":"ContainerStarted","Data":"b55c4a54b9e94f86d4b851550a2a048b7f5dbfce173622b1b0dd693b70669088"} Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.239610 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.244629 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.244627 4713 generic.go:334] "Generic (PLEG): container finished" podID="0aeae175-1650-4587-b866-00a6d082a849" containerID="b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef" exitCode=0 Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.244664 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjxhl" event={"ID":"0aeae175-1650-4587-b866-00a6d082a849","Type":"ContainerDied","Data":"b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef"} Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.245336 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="ceilometer-central-agent" containerID="cri-o://d677557d6954eec69c34f315885d278c5fcddac2f22ca4cd0cc858e11392e179" gracePeriod=30 Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.245364 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="proxy-httpd" containerID="cri-o://e7c8c4c5aabaa4465f3f3275843379f8eeb6f17ee04cf8710e9871f8a6c4e478" gracePeriod=30 Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.245452 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="sg-core" containerID="cri-o://08233fd3048ee585cd90b9e5a2f61b6e70c48aedef235f462fe9c615c3fa08b4" gracePeriod=30 Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.245484 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="ceilometer-notification-agent" containerID="cri-o://c0244abe004980ad0b6eaa29b9f687185e922aad276605d0df780d7e84cc702d" gracePeriod=30 Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.245496 4713 scope.go:117] "RemoveContainer" containerID="e9e4f37a635fcaa058bccde83e26d5ad3e7aac7a29f11ee3ee44eed009540686" Mar 14 05:54:02 crc kubenswrapper[4713]: E0314 05:54:02.245788 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-b896f6bb4-lbjfr_openstack(af1eb3bb-1126-4289-9156-a804d676272f)\"" pod="openstack/heat-api-b896f6bb4-lbjfr" podUID="af1eb3bb-1126-4289-9156-a804d676272f" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.245830 4713 scope.go:117] "RemoveContainer" 
containerID="f8fedfbfb80f728f3a8d00625b66856745d0e0fed6183a5b796a31320f20189d" Mar 14 05:54:02 crc kubenswrapper[4713]: E0314 05:54:02.246054 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5955cd59bf-ttt8j_openstack(473ca677-e1ad-431f-9f9e-c8260e43cda2)\"" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.263042 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.329259 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.461527 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-b896f6bb4-lbjfr"] Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.588918 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:54:02 crc kubenswrapper[4713]: I0314 05:54:02.718839 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5955cd59bf-ttt8j"] Mar 14 05:54:03 crc kubenswrapper[4713]: I0314 05:54:03.200904 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzqjx"] Mar 14 05:54:03 crc kubenswrapper[4713]: E0314 05:54:03.303157 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache]" Mar 14 05:54:03 crc kubenswrapper[4713]: E0314 05:54:03.311288 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17433094_f4d7_4059_a74a_80ee3dff9981.slice/crio-e7c8c4c5aabaa4465f3f3275843379f8eeb6f17ee04cf8710e9871f8a6c4e478.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17433094_f4d7_4059_a74a_80ee3dff9981.slice/crio-conmon-c0244abe004980ad0b6eaa29b9f687185e922aad276605d0df780d7e84cc702d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17433094_f4d7_4059_a74a_80ee3dff9981.slice/crio-c0244abe004980ad0b6eaa29b9f687185e922aad276605d0df780d7e84cc702d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17433094_f4d7_4059_a74a_80ee3dff9981.slice/crio-conmon-e7c8c4c5aabaa4465f3f3275843379f8eeb6f17ee04cf8710e9871f8a6c4e478.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:54:03 crc kubenswrapper[4713]: I0314 05:54:03.343645 4713 generic.go:334] "Generic (PLEG): container finished" podID="17433094-f4d7-4059-a74a-80ee3dff9981" 
containerID="e7c8c4c5aabaa4465f3f3275843379f8eeb6f17ee04cf8710e9871f8a6c4e478" exitCode=0 Mar 14 05:54:03 crc kubenswrapper[4713]: I0314 05:54:03.343686 4713 generic.go:334] "Generic (PLEG): container finished" podID="17433094-f4d7-4059-a74a-80ee3dff9981" containerID="08233fd3048ee585cd90b9e5a2f61b6e70c48aedef235f462fe9c615c3fa08b4" exitCode=2 Mar 14 05:54:03 crc kubenswrapper[4713]: I0314 05:54:03.343697 4713 generic.go:334] "Generic (PLEG): container finished" podID="17433094-f4d7-4059-a74a-80ee3dff9981" containerID="c0244abe004980ad0b6eaa29b9f687185e922aad276605d0df780d7e84cc702d" exitCode=0 Mar 14 05:54:03 crc kubenswrapper[4713]: I0314 05:54:03.344947 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerDied","Data":"e7c8c4c5aabaa4465f3f3275843379f8eeb6f17ee04cf8710e9871f8a6c4e478"} Mar 14 05:54:03 crc kubenswrapper[4713]: I0314 05:54:03.344983 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerDied","Data":"08233fd3048ee585cd90b9e5a2f61b6e70c48aedef235f462fe9c615c3fa08b4"} Mar 14 05:54:03 crc kubenswrapper[4713]: I0314 05:54:03.344995 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerDied","Data":"c0244abe004980ad0b6eaa29b9f687185e922aad276605d0df780d7e84cc702d"} Mar 14 05:54:03 crc kubenswrapper[4713]: I0314 05:54:03.806928 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-684cc7695b-tnj9p" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.092250 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.099138 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dz7t\" (UniqueName: \"kubernetes.io/projected/af1eb3bb-1126-4289-9156-a804d676272f-kube-api-access-2dz7t\") pod \"af1eb3bb-1126-4289-9156-a804d676272f\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.099279 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-combined-ca-bundle\") pod \"af1eb3bb-1126-4289-9156-a804d676272f\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.099359 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data-custom\") pod \"af1eb3bb-1126-4289-9156-a804d676272f\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.099496 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data\") pod \"af1eb3bb-1126-4289-9156-a804d676272f\" (UID: \"af1eb3bb-1126-4289-9156-a804d676272f\") " Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.122403 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "af1eb3bb-1126-4289-9156-a804d676272f" (UID: "af1eb3bb-1126-4289-9156-a804d676272f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.122866 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1eb3bb-1126-4289-9156-a804d676272f-kube-api-access-2dz7t" (OuterVolumeSpecName: "kube-api-access-2dz7t") pod "af1eb3bb-1126-4289-9156-a804d676272f" (UID: "af1eb3bb-1126-4289-9156-a804d676272f"). InnerVolumeSpecName "kube-api-access-2dz7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.187600 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af1eb3bb-1126-4289-9156-a804d676272f" (UID: "af1eb3bb-1126-4289-9156-a804d676272f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.202242 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dz7t\" (UniqueName: \"kubernetes.io/projected/af1eb3bb-1126-4289-9156-a804d676272f-kube-api-access-2dz7t\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.210923 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.212999 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.216725 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.262450 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data" (OuterVolumeSpecName: "config-data") pod "af1eb3bb-1126-4289-9156-a804d676272f" (UID: "af1eb3bb-1126-4289-9156-a804d676272f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.319816 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-combined-ca-bundle\") pod \"473ca677-e1ad-431f-9f9e-c8260e43cda2\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.319940 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lwtt\" (UniqueName: \"kubernetes.io/projected/473ca677-e1ad-431f-9f9e-c8260e43cda2-kube-api-access-2lwtt\") pod \"473ca677-e1ad-431f-9f9e-c8260e43cda2\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.320111 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data\") pod \"473ca677-e1ad-431f-9f9e-c8260e43cda2\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.320160 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data-custom\") pod \"473ca677-e1ad-431f-9f9e-c8260e43cda2\" (UID: \"473ca677-e1ad-431f-9f9e-c8260e43cda2\") " Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.320894 4713 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1eb3bb-1126-4289-9156-a804d676272f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.347069 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "473ca677-e1ad-431f-9f9e-c8260e43cda2" (UID: "473ca677-e1ad-431f-9f9e-c8260e43cda2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.348687 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473ca677-e1ad-431f-9f9e-c8260e43cda2-kube-api-access-2lwtt" (OuterVolumeSpecName: "kube-api-access-2lwtt") pod "473ca677-e1ad-431f-9f9e-c8260e43cda2" (UID: "473ca677-e1ad-431f-9f9e-c8260e43cda2"). InnerVolumeSpecName "kube-api-access-2lwtt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.366923 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjxhl" event={"ID":"0aeae175-1650-4587-b866-00a6d082a849","Type":"ContainerStarted","Data":"476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a"} Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.380165 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" event={"ID":"473ca677-e1ad-431f-9f9e-c8260e43cda2","Type":"ContainerDied","Data":"f2cf62a638222b53d1705e3cd94f3adf3ac38dd5262fad49a3edda7405c22f30"} Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.380418 4713 scope.go:117] "RemoveContainer" containerID="f8fedfbfb80f728f3a8d00625b66856745d0e0fed6183a5b796a31320f20189d" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.380622 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5955cd59bf-ttt8j" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.403497 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sjxhl" podStartSLOduration=6.376552086 podStartE2EDuration="12.403475619s" podCreationTimestamp="2026-03-14 05:53:52 +0000 UTC" firstStartedPulling="2026-03-14 05:53:56.695329984 +0000 UTC m=+1619.783239284" lastFinishedPulling="2026-03-14 05:54:02.722253517 +0000 UTC m=+1625.810162817" observedRunningTime="2026-03-14 05:54:04.393574043 +0000 UTC m=+1627.481483343" watchObservedRunningTime="2026-03-14 05:54:04.403475619 +0000 UTC m=+1627.491384919" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.422268 4713 generic.go:334] "Generic (PLEG): container finished" podID="17433094-f4d7-4059-a74a-80ee3dff9981" containerID="d677557d6954eec69c34f315885d278c5fcddac2f22ca4cd0cc858e11392e179" exitCode=0 Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 
05:54:04.422388 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerDied","Data":"d677557d6954eec69c34f315885d278c5fcddac2f22ca4cd0cc858e11392e179"} Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.426052 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.426086 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lwtt\" (UniqueName: \"kubernetes.io/projected/473ca677-e1ad-431f-9f9e-c8260e43cda2-kube-api-access-2lwtt\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.431990 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b896f6bb4-lbjfr" event={"ID":"af1eb3bb-1126-4289-9156-a804d676272f","Type":"ContainerDied","Data":"ba1a97434cb28f52cbba9ead8ac9eb3db263bbd189769e32f0210987e5c0748d"} Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.432154 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-b896f6bb4-lbjfr" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.438457 4713 generic.go:334] "Generic (PLEG): container finished" podID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerID="67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66" exitCode=0 Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.438528 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzqjx" event={"ID":"8711e2a1-6795-4c17-afad-a95c01fbf5f9","Type":"ContainerDied","Data":"67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66"} Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.438559 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzqjx" event={"ID":"8711e2a1-6795-4c17-afad-a95c01fbf5f9","Type":"ContainerStarted","Data":"b289290509595683bdea4f31f8ea3bd4ff1a384cca1bc877cba3c4d38fddae77"} Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.513194 4713 scope.go:117] "RemoveContainer" containerID="e9e4f37a635fcaa058bccde83e26d5ad3e7aac7a29f11ee3ee44eed009540686" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.560287 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "473ca677-e1ad-431f-9f9e-c8260e43cda2" (UID: "473ca677-e1ad-431f-9f9e-c8260e43cda2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.583363 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data" (OuterVolumeSpecName: "config-data") pod "473ca677-e1ad-431f-9f9e-c8260e43cda2" (UID: "473ca677-e1ad-431f-9f9e-c8260e43cda2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.640319 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-b896f6bb4-lbjfr"] Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.671267 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.671296 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ca677-e1ad-431f-9f9e-c8260e43cda2-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.707647 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-b896f6bb4-lbjfr"] Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.753691 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5955cd59bf-ttt8j"] Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.766651 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5955cd59bf-ttt8j"] Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.781346 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.887406 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-599cd54c4b-t7gdc"] Mar 14 05:54:04 crc kubenswrapper[4713]: I0314 05:54:04.887687 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-599cd54c4b-t7gdc" podUID="f1d1368b-65d8-43fa-8025-0c41a7d0dd14" containerName="heat-engine" containerID="cri-o://793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" gracePeriod=60 Mar 14 05:54:04 crc kubenswrapper[4713]: E0314 
05:54:04.906112 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:54:04 crc kubenswrapper[4713]: E0314 05:54:04.943467 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:54:04 crc kubenswrapper[4713]: E0314 05:54:04.967321 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:54:04 crc kubenswrapper[4713]: E0314 05:54:04.967402 4713 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-599cd54c4b-t7gdc" podUID="f1d1368b-65d8-43fa-8025-0c41a7d0dd14" containerName="heat-engine" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.048398 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.087394 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-combined-ca-bundle\") pod \"17433094-f4d7-4059-a74a-80ee3dff9981\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.087458 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf4z2\" (UniqueName: \"kubernetes.io/projected/17433094-f4d7-4059-a74a-80ee3dff9981-kube-api-access-sf4z2\") pod \"17433094-f4d7-4059-a74a-80ee3dff9981\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.087568 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-scripts\") pod \"17433094-f4d7-4059-a74a-80ee3dff9981\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.087651 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-sg-core-conf-yaml\") pod \"17433094-f4d7-4059-a74a-80ee3dff9981\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.087832 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-log-httpd\") pod \"17433094-f4d7-4059-a74a-80ee3dff9981\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.087863 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-config-data\") pod \"17433094-f4d7-4059-a74a-80ee3dff9981\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.087898 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-run-httpd\") pod \"17433094-f4d7-4059-a74a-80ee3dff9981\" (UID: \"17433094-f4d7-4059-a74a-80ee3dff9981\") " Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.088869 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "17433094-f4d7-4059-a74a-80ee3dff9981" (UID: "17433094-f4d7-4059-a74a-80ee3dff9981"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.100497 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "17433094-f4d7-4059-a74a-80ee3dff9981" (UID: "17433094-f4d7-4059-a74a-80ee3dff9981"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.103184 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-scripts" (OuterVolumeSpecName: "scripts") pod "17433094-f4d7-4059-a74a-80ee3dff9981" (UID: "17433094-f4d7-4059-a74a-80ee3dff9981"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.108758 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17433094-f4d7-4059-a74a-80ee3dff9981-kube-api-access-sf4z2" (OuterVolumeSpecName: "kube-api-access-sf4z2") pod "17433094-f4d7-4059-a74a-80ee3dff9981" (UID: "17433094-f4d7-4059-a74a-80ee3dff9981"). InnerVolumeSpecName "kube-api-access-sf4z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.190414 4713 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.190448 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf4z2\" (UniqueName: \"kubernetes.io/projected/17433094-f4d7-4059-a74a-80ee3dff9981-kube-api-access-sf4z2\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.190463 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.190472 4713 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17433094-f4d7-4059-a74a-80ee3dff9981-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.298149 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-config-data" (OuterVolumeSpecName: "config-data") pod "17433094-f4d7-4059-a74a-80ee3dff9981" (UID: "17433094-f4d7-4059-a74a-80ee3dff9981"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.325388 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17433094-f4d7-4059-a74a-80ee3dff9981" (UID: "17433094-f4d7-4059-a74a-80ee3dff9981"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.331560 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "17433094-f4d7-4059-a74a-80ee3dff9981" (UID: "17433094-f4d7-4059-a74a-80ee3dff9981"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.395224 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.395501 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.395590 4713 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17433094-f4d7-4059-a74a-80ee3dff9981-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.464136 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557794-zhr59" 
event={"ID":"d80784b9-9a04-4aca-9515-d9540532b039","Type":"ContainerStarted","Data":"d179be8d99d176a187710044c0e144849dff020f4f067946d4be94af6f17ba4a"} Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.486569 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17433094-f4d7-4059-a74a-80ee3dff9981","Type":"ContainerDied","Data":"7f674ed356ffe7f3da5f77e31d004959e7db65473f0ec1126f53966463fa7f2b"} Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.486619 4713 scope.go:117] "RemoveContainer" containerID="e7c8c4c5aabaa4465f3f3275843379f8eeb6f17ee04cf8710e9871f8a6c4e478" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.486739 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.498586 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557794-zhr59" podStartSLOduration=4.062928471 podStartE2EDuration="5.498560291s" podCreationTimestamp="2026-03-14 05:54:00 +0000 UTC" firstStartedPulling="2026-03-14 05:54:01.565896464 +0000 UTC m=+1624.653805764" lastFinishedPulling="2026-03-14 05:54:03.001528284 +0000 UTC m=+1626.089437584" observedRunningTime="2026-03-14 05:54:05.485136982 +0000 UTC m=+1628.573046282" watchObservedRunningTime="2026-03-14 05:54:05.498560291 +0000 UTC m=+1628.586469611" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.601680 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" path="/var/lib/kubelet/pods/473ca677-e1ad-431f-9f9e-c8260e43cda2/volumes" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.602326 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1eb3bb-1126-4289-9156-a804d676272f" path="/var/lib/kubelet/pods/af1eb3bb-1126-4289-9156-a804d676272f/volumes" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.602918 
4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.608543 4713 scope.go:117] "RemoveContainer" containerID="08233fd3048ee585cd90b9e5a2f61b6e70c48aedef235f462fe9c615c3fa08b4" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.618117 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.642523 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.643236 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" containerName="heat-cfnapi" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643253 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" containerName="heat-cfnapi" Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.643281 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="ceilometer-notification-agent" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643289 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="ceilometer-notification-agent" Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.643309 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="sg-core" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643317 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="sg-core" Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.643335 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="ceilometer-central-agent" Mar 14 05:54:05 crc 
kubenswrapper[4713]: I0314 05:54:05.643343 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="ceilometer-central-agent" Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.643376 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="proxy-httpd" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643383 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="proxy-httpd" Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.643398 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1eb3bb-1126-4289-9156-a804d676272f" containerName="heat-api" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643437 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1eb3bb-1126-4289-9156-a804d676272f" containerName="heat-api" Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.643459 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1eb3bb-1126-4289-9156-a804d676272f" containerName="heat-api" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643474 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1eb3bb-1126-4289-9156-a804d676272f" containerName="heat-api" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643746 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1eb3bb-1126-4289-9156-a804d676272f" containerName="heat-api" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643761 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="ceilometer-notification-agent" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643771 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="proxy-httpd" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 
05:54:05.643797 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1eb3bb-1126-4289-9156-a804d676272f" containerName="heat-api" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643824 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="ceilometer-central-agent" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643853 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" containerName="heat-cfnapi" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643872 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" containerName="heat-cfnapi" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.643882 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" containerName="sg-core" Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.644135 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" containerName="heat-cfnapi" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.644148 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ca677-e1ad-431f-9f9e-c8260e43cda2" containerName="heat-cfnapi" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.646989 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.650259 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.650527 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.668646 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.683392 4713 scope.go:117] "RemoveContainer" containerID="c0244abe004980ad0b6eaa29b9f687185e922aad276605d0df780d7e84cc702d" Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.683547 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.693521 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.710354 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:54:05 crc kubenswrapper[4713]: E0314 05:54:05.710441 4713 prober.go:104] "Probe errored" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-599cd54c4b-t7gdc" podUID="f1d1368b-65d8-43fa-8025-0c41a7d0dd14" containerName="heat-engine" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.747404 4713 scope.go:117] "RemoveContainer" containerID="d677557d6954eec69c34f315885d278c5fcddac2f22ca4cd0cc858e11392e179" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.809762 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6bcf\" (UniqueName: \"kubernetes.io/projected/182fef0d-d2fa-4bb4-8d12-d694c42247af-kube-api-access-g6bcf\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.809828 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-log-httpd\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.809845 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-scripts\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.809920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-run-httpd\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.809947 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-config-data\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.809987 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.810009 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.815692 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.913644 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-run-httpd\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.913751 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-config-data\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.913861 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.913887 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.914054 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6bcf\" (UniqueName: \"kubernetes.io/projected/182fef0d-d2fa-4bb4-8d12-d694c42247af-kube-api-access-g6bcf\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.914091 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-log-httpd\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.914110 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-scripts\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.917002 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-run-httpd\") pod \"ceilometer-0\" (UID: 
\"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.923737 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-log-httpd\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.925282 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-config-data\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.932107 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.932831 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.938482 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-scripts\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.947159 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6bcf\" (UniqueName: 
\"kubernetes.io/projected/182fef0d-d2fa-4bb4-8d12-d694c42247af-kube-api-access-g6bcf\") pod \"ceilometer-0\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " pod="openstack/ceilometer-0" Mar 14 05:54:05 crc kubenswrapper[4713]: I0314 05:54:05.975926 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:54:06 crc kubenswrapper[4713]: I0314 05:54:06.563303 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzqjx" event={"ID":"8711e2a1-6795-4c17-afad-a95c01fbf5f9","Type":"ContainerStarted","Data":"f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45"} Mar 14 05:54:06 crc kubenswrapper[4713]: W0314 05:54:06.853036 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod182fef0d_d2fa_4bb4_8d12_d694c42247af.slice/crio-9872624a77e87016ee925480b14926451dfc5bb6c23ad907f8bb25840ad2abc1 WatchSource:0}: Error finding container 9872624a77e87016ee925480b14926451dfc5bb6c23ad907f8bb25840ad2abc1: Status 404 returned error can't find the container with id 9872624a77e87016ee925480b14926451dfc5bb6c23ad907f8bb25840ad2abc1 Mar 14 05:54:06 crc kubenswrapper[4713]: I0314 05:54:06.899147 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:07 crc kubenswrapper[4713]: I0314 05:54:07.595542 4713 generic.go:334] "Generic (PLEG): container finished" podID="d80784b9-9a04-4aca-9515-d9540532b039" containerID="d179be8d99d176a187710044c0e144849dff020f4f067946d4be94af6f17ba4a" exitCode=0 Mar 14 05:54:07 crc kubenswrapper[4713]: I0314 05:54:07.599613 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17433094-f4d7-4059-a74a-80ee3dff9981" path="/var/lib/kubelet/pods/17433094-f4d7-4059-a74a-80ee3dff9981/volumes" Mar 14 05:54:07 crc kubenswrapper[4713]: I0314 05:54:07.600421 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerStarted","Data":"9872624a77e87016ee925480b14926451dfc5bb6c23ad907f8bb25840ad2abc1"} Mar 14 05:54:07 crc kubenswrapper[4713]: I0314 05:54:07.600451 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557794-zhr59" event={"ID":"d80784b9-9a04-4aca-9515-d9540532b039","Type":"ContainerDied","Data":"d179be8d99d176a187710044c0e144849dff020f4f067946d4be94af6f17ba4a"} Mar 14 05:54:08 crc kubenswrapper[4713]: I0314 05:54:08.609774 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerStarted","Data":"d57ca0cebf20a48b0d49d602f7583e3194bdaef5fc6cfc1b94e11b83adfbf0de"} Mar 14 05:54:08 crc kubenswrapper[4713]: I0314 05:54:08.610349 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerStarted","Data":"cc3cd3c5ea90e3ece22119b03bd5f1869bb745fa3417d1e2715c881f45000966"} Mar 14 05:54:08 crc kubenswrapper[4713]: I0314 05:54:08.612941 4713 generic.go:334] "Generic (PLEG): container finished" podID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerID="f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45" exitCode=0 Mar 14 05:54:08 crc kubenswrapper[4713]: I0314 05:54:08.613102 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzqjx" event={"ID":"8711e2a1-6795-4c17-afad-a95c01fbf5f9","Type":"ContainerDied","Data":"f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45"} Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.228988 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557794-zhr59" Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.360603 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkv4g\" (UniqueName: \"kubernetes.io/projected/d80784b9-9a04-4aca-9515-d9540532b039-kube-api-access-gkv4g\") pod \"d80784b9-9a04-4aca-9515-d9540532b039\" (UID: \"d80784b9-9a04-4aca-9515-d9540532b039\") " Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.370571 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d80784b9-9a04-4aca-9515-d9540532b039-kube-api-access-gkv4g" (OuterVolumeSpecName: "kube-api-access-gkv4g") pod "d80784b9-9a04-4aca-9515-d9540532b039" (UID: "d80784b9-9a04-4aca-9515-d9540532b039"). InnerVolumeSpecName "kube-api-access-gkv4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.464455 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkv4g\" (UniqueName: \"kubernetes.io/projected/d80784b9-9a04-4aca-9515-d9540532b039-kube-api-access-gkv4g\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.653648 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557794-zhr59" event={"ID":"d80784b9-9a04-4aca-9515-d9540532b039","Type":"ContainerDied","Data":"b55c4a54b9e94f86d4b851550a2a048b7f5dbfce173622b1b0dd693b70669088"} Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.653701 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b55c4a54b9e94f86d4b851550a2a048b7f5dbfce173622b1b0dd693b70669088" Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.653788 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557794-zhr59" Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.669611 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzqjx" event={"ID":"8711e2a1-6795-4c17-afad-a95c01fbf5f9","Type":"ContainerStarted","Data":"11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc"} Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.685425 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerStarted","Data":"a01bdcb570d654991263d4943413a2903718138bc9c7e8b14e1815d2fff66d52"} Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.728643 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dzqjx" podStartSLOduration=4.10267161 podStartE2EDuration="8.728615409s" podCreationTimestamp="2026-03-14 05:54:01 +0000 UTC" firstStartedPulling="2026-03-14 05:54:04.513640547 +0000 UTC m=+1627.601549847" lastFinishedPulling="2026-03-14 05:54:09.139584346 +0000 UTC m=+1632.227493646" observedRunningTime="2026-03-14 05:54:09.695093031 +0000 UTC m=+1632.783002331" watchObservedRunningTime="2026-03-14 05:54:09.728615409 +0000 UTC m=+1632.816524719" Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.763347 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557788-72k4j"] Mar 14 05:54:09 crc kubenswrapper[4713]: I0314 05:54:09.788020 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557788-72k4j"] Mar 14 05:54:11 crc kubenswrapper[4713]: I0314 05:54:11.482994 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:11 crc kubenswrapper[4713]: I0314 05:54:11.578334 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="046a191f-1297-43e4-ad80-9cfdad08202b" path="/var/lib/kubelet/pods/046a191f-1297-43e4-ad80-9cfdad08202b/volumes" Mar 14 05:54:12 crc kubenswrapper[4713]: I0314 05:54:12.264786 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:12 crc kubenswrapper[4713]: I0314 05:54:12.264939 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dzqjx" Mar 14 05:54:12 crc kubenswrapper[4713]: I0314 05:54:12.632984 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:54:12 crc kubenswrapper[4713]: I0314 05:54:12.633451 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:54:13 crc kubenswrapper[4713]: I0314 05:54:13.333659 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dzqjx" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="registry-server" probeResult="failure" output=< Mar 14 05:54:13 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:54:13 crc kubenswrapper[4713]: > Mar 14 05:54:13 crc kubenswrapper[4713]: E0314 05:54:13.623724 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:54:13 crc kubenswrapper[4713]: I0314 05:54:13.702837 4713 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/community-operators-sjxhl" podUID="0aeae175-1650-4587-b866-00a6d082a849" containerName="registry-server" probeResult="failure" output=< Mar 14 05:54:13 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:54:13 crc kubenswrapper[4713]: > Mar 14 05:54:14 crc kubenswrapper[4713]: I0314 05:54:14.780147 4713 generic.go:334] "Generic (PLEG): container finished" podID="f1d1368b-65d8-43fa-8025-0c41a7d0dd14" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" exitCode=0 Mar 14 05:54:14 crc kubenswrapper[4713]: I0314 05:54:14.780340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-599cd54c4b-t7gdc" event={"ID":"f1d1368b-65d8-43fa-8025-0c41a7d0dd14","Type":"ContainerDied","Data":"793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19"} Mar 14 05:54:15 crc kubenswrapper[4713]: E0314 05:54:15.663133 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19 is running failed: container process not found" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:54:15 crc kubenswrapper[4713]: E0314 05:54:15.663952 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19 is running failed: container process not found" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:54:15 crc kubenswrapper[4713]: E0314 05:54:15.664391 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID 
of 793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19 is running failed: container process not found" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:54:15 crc kubenswrapper[4713]: E0314 05:54:15.664454 4713 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19 is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-599cd54c4b-t7gdc" podUID="f1d1368b-65d8-43fa-8025-0c41a7d0dd14" containerName="heat-engine" Mar 14 05:54:17 crc kubenswrapper[4713]: E0314 05:54:17.606768 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.246972 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-599cd54c4b-t7gdc" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.332867 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-combined-ca-bundle\") pod \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.332954 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data-custom\") pod \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.333268 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsctc\" (UniqueName: \"kubernetes.io/projected/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-kube-api-access-hsctc\") pod \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.333317 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data\") pod \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\" (UID: \"f1d1368b-65d8-43fa-8025-0c41a7d0dd14\") " Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.337460 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f1d1368b-65d8-43fa-8025-0c41a7d0dd14" (UID: "f1d1368b-65d8-43fa-8025-0c41a7d0dd14"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.338063 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-kube-api-access-hsctc" (OuterVolumeSpecName: "kube-api-access-hsctc") pod "f1d1368b-65d8-43fa-8025-0c41a7d0dd14" (UID: "f1d1368b-65d8-43fa-8025-0c41a7d0dd14"). InnerVolumeSpecName "kube-api-access-hsctc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.378768 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1d1368b-65d8-43fa-8025-0c41a7d0dd14" (UID: "f1d1368b-65d8-43fa-8025-0c41a7d0dd14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.445856 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.445895 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.445906 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsctc\" (UniqueName: \"kubernetes.io/projected/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-kube-api-access-hsctc\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.447105 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data" (OuterVolumeSpecName: "config-data") pod "f1d1368b-65d8-43fa-8025-0c41a7d0dd14" (UID: "f1d1368b-65d8-43fa-8025-0c41a7d0dd14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.549535 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d1368b-65d8-43fa-8025-0c41a7d0dd14-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.890345 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerStarted","Data":"c865cc39277c94db5a32fe57ebcfa0a90b55fdeb4f102adb7c87e143e966402e"} Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.890491 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="ceilometer-central-agent" containerID="cri-o://cc3cd3c5ea90e3ece22119b03bd5f1869bb745fa3417d1e2715c881f45000966" gracePeriod=30 Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.890783 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.890815 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="proxy-httpd" containerID="cri-o://c865cc39277c94db5a32fe57ebcfa0a90b55fdeb4f102adb7c87e143e966402e" gracePeriod=30 Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.890872 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="ceilometer-notification-agent" 
containerID="cri-o://d57ca0cebf20a48b0d49d602f7583e3194bdaef5fc6cfc1b94e11b83adfbf0de" gracePeriod=30 Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.890909 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="sg-core" containerID="cri-o://a01bdcb570d654991263d4943413a2903718138bc9c7e8b14e1815d2fff66d52" gracePeriod=30 Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.894826 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-599cd54c4b-t7gdc" event={"ID":"f1d1368b-65d8-43fa-8025-0c41a7d0dd14","Type":"ContainerDied","Data":"94cd47374e3af312227561edd10dda6fb06d67ac93c8e4acea0005231c3e470b"} Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.895080 4713 scope.go:117] "RemoveContainer" containerID="793381cb9ce08cc142e550e65d4af35c0da352d299c9a103508483dcb750ff19" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.895352 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-599cd54c4b-t7gdc" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.899165 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zrsf8" event={"ID":"0cd56fa6-325c-4813-bffe-a2cd3bf82257","Type":"ContainerStarted","Data":"2506a986b947dd8202511e0a11956aa33b328bb708c1d658cc22810b613a5d5e"} Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.919627 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.791610281 podStartE2EDuration="14.91960316s" podCreationTimestamp="2026-03-14 05:54:05 +0000 UTC" firstStartedPulling="2026-03-14 05:54:06.865721088 +0000 UTC m=+1629.953630388" lastFinishedPulling="2026-03-14 05:54:18.993713957 +0000 UTC m=+1642.081623267" observedRunningTime="2026-03-14 05:54:19.91556418 +0000 UTC m=+1643.003473490" watchObservedRunningTime="2026-03-14 05:54:19.91960316 +0000 UTC m=+1643.007512460" Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.964306 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-599cd54c4b-t7gdc"] Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.982491 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-599cd54c4b-t7gdc"] Mar 14 05:54:19 crc kubenswrapper[4713]: I0314 05:54:19.995656 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zrsf8" podStartSLOduration=3.342915985 podStartE2EDuration="24.995634091s" podCreationTimestamp="2026-03-14 05:53:55 +0000 UTC" firstStartedPulling="2026-03-14 05:53:57.340914038 +0000 UTC m=+1620.428823338" lastFinishedPulling="2026-03-14 05:54:18.993632144 +0000 UTC m=+1642.081541444" observedRunningTime="2026-03-14 05:54:19.954501511 +0000 UTC m=+1643.042410811" watchObservedRunningTime="2026-03-14 05:54:19.995634091 +0000 UTC m=+1643.083543391" Mar 14 05:54:20 crc 
kubenswrapper[4713]: I0314 05:54:20.914747 4713 generic.go:334] "Generic (PLEG): container finished" podID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerID="a01bdcb570d654991263d4943413a2903718138bc9c7e8b14e1815d2fff66d52" exitCode=2 Mar 14 05:54:20 crc kubenswrapper[4713]: I0314 05:54:20.915870 4713 generic.go:334] "Generic (PLEG): container finished" podID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerID="d57ca0cebf20a48b0d49d602f7583e3194bdaef5fc6cfc1b94e11b83adfbf0de" exitCode=0 Mar 14 05:54:20 crc kubenswrapper[4713]: I0314 05:54:20.915937 4713 generic.go:334] "Generic (PLEG): container finished" podID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerID="cc3cd3c5ea90e3ece22119b03bd5f1869bb745fa3417d1e2715c881f45000966" exitCode=0 Mar 14 05:54:20 crc kubenswrapper[4713]: I0314 05:54:20.914821 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerDied","Data":"a01bdcb570d654991263d4943413a2903718138bc9c7e8b14e1815d2fff66d52"} Mar 14 05:54:20 crc kubenswrapper[4713]: I0314 05:54:20.916125 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerDied","Data":"d57ca0cebf20a48b0d49d602f7583e3194bdaef5fc6cfc1b94e11b83adfbf0de"} Mar 14 05:54:20 crc kubenswrapper[4713]: I0314 05:54:20.916198 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerDied","Data":"cc3cd3c5ea90e3ece22119b03bd5f1869bb745fa3417d1e2715c881f45000966"} Mar 14 05:54:21 crc kubenswrapper[4713]: I0314 05:54:21.578146 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d1368b-65d8-43fa-8025-0c41a7d0dd14" path="/var/lib/kubelet/pods/f1d1368b-65d8-43fa-8025-0c41a7d0dd14/volumes" Mar 14 05:54:22 crc kubenswrapper[4713]: I0314 05:54:22.688359 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:54:22 crc kubenswrapper[4713]: I0314 05:54:22.743801 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:54:23 crc kubenswrapper[4713]: I0314 05:54:23.327523 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dzqjx" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="registry-server" probeResult="failure" output=< Mar 14 05:54:23 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 05:54:23 crc kubenswrapper[4713]: > Mar 14 05:54:23 crc kubenswrapper[4713]: I0314 05:54:23.489826 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sjxhl"] Mar 14 05:54:23 crc kubenswrapper[4713]: I0314 05:54:23.950050 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sjxhl" podUID="0aeae175-1650-4587-b866-00a6d082a849" containerName="registry-server" containerID="cri-o://476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a" gracePeriod=2 Mar 14 05:54:23 crc kubenswrapper[4713]: E0314 05:54:23.962473 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.611677 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.699652 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-catalog-content\") pod \"0aeae175-1650-4587-b866-00a6d082a849\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.699834 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2fsz\" (UniqueName: \"kubernetes.io/projected/0aeae175-1650-4587-b866-00a6d082a849-kube-api-access-q2fsz\") pod \"0aeae175-1650-4587-b866-00a6d082a849\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.700028 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-utilities\") pod \"0aeae175-1650-4587-b866-00a6d082a849\" (UID: \"0aeae175-1650-4587-b866-00a6d082a849\") " Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.701424 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-utilities" (OuterVolumeSpecName: "utilities") pod "0aeae175-1650-4587-b866-00a6d082a849" (UID: "0aeae175-1650-4587-b866-00a6d082a849"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.715093 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aeae175-1650-4587-b866-00a6d082a849-kube-api-access-q2fsz" (OuterVolumeSpecName: "kube-api-access-q2fsz") pod "0aeae175-1650-4587-b866-00a6d082a849" (UID: "0aeae175-1650-4587-b866-00a6d082a849"). InnerVolumeSpecName "kube-api-access-q2fsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.773097 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0aeae175-1650-4587-b866-00a6d082a849" (UID: "0aeae175-1650-4587-b866-00a6d082a849"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.803641 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.803728 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aeae175-1650-4587-b866-00a6d082a849-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.803746 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2fsz\" (UniqueName: \"kubernetes.io/projected/0aeae175-1650-4587-b866-00a6d082a849-kube-api-access-q2fsz\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.965981 4713 generic.go:334] "Generic (PLEG): container finished" podID="0aeae175-1650-4587-b866-00a6d082a849" containerID="476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a" exitCode=0 Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.966029 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjxhl" event={"ID":"0aeae175-1650-4587-b866-00a6d082a849","Type":"ContainerDied","Data":"476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a"} Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.966046 4713 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-sjxhl" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.966071 4713 scope.go:117] "RemoveContainer" containerID="476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a" Mar 14 05:54:24 crc kubenswrapper[4713]: I0314 05:54:24.966057 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjxhl" event={"ID":"0aeae175-1650-4587-b866-00a6d082a849","Type":"ContainerDied","Data":"1c504176b75d6dcb307133ad04c70aa8e7fb3e15ede67e83941e8f7a59716906"} Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.015315 4713 scope.go:117] "RemoveContainer" containerID="b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef" Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.015325 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sjxhl"] Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.032822 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sjxhl"] Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.048080 4713 scope.go:117] "RemoveContainer" containerID="7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea" Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.108481 4713 scope.go:117] "RemoveContainer" containerID="476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a" Mar 14 05:54:25 crc kubenswrapper[4713]: E0314 05:54:25.109048 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a\": container with ID starting with 476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a not found: ID does not exist" containerID="476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a" Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.109090 
4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a"} err="failed to get container status \"476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a\": rpc error: code = NotFound desc = could not find container \"476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a\": container with ID starting with 476447a7df4e59af23eb0a879c3044ed771af978e0a09f82028f4e48e415314a not found: ID does not exist"
Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.109115 4713 scope.go:117] "RemoveContainer" containerID="b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef"
Mar 14 05:54:25 crc kubenswrapper[4713]: E0314 05:54:25.109404 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef\": container with ID starting with b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef not found: ID does not exist" containerID="b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef"
Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.109433 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef"} err="failed to get container status \"b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef\": rpc error: code = NotFound desc = could not find container \"b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef\": container with ID starting with b1f14716637c9ac5e3a1668b8794a6382aaa535b9f75a83375003c7c73a91aef not found: ID does not exist"
Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.109453 4713 scope.go:117] "RemoveContainer" containerID="7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea"
Mar 14 05:54:25 crc kubenswrapper[4713]: E0314 05:54:25.109849 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea\": container with ID starting with 7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea not found: ID does not exist" containerID="7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea"
Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.109885 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea"} err="failed to get container status \"7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea\": rpc error: code = NotFound desc = could not find container \"7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea\": container with ID starting with 7430adca3fcfe6dd97d5e82811e1085dcb25ee8e286ad35035874a3fa0020dea not found: ID does not exist"
Mar 14 05:54:25 crc kubenswrapper[4713]: I0314 05:54:25.578200 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aeae175-1650-4587-b866-00a6d082a849" path="/var/lib/kubelet/pods/0aeae175-1650-4587-b866-00a6d082a849/volumes"
Mar 14 05:54:32 crc kubenswrapper[4713]: I0314 05:54:32.056656 4713 generic.go:334] "Generic (PLEG): container finished" podID="0cd56fa6-325c-4813-bffe-a2cd3bf82257" containerID="2506a986b947dd8202511e0a11956aa33b328bb708c1d658cc22810b613a5d5e" exitCode=0
Mar 14 05:54:32 crc kubenswrapper[4713]: I0314 05:54:32.056738 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zrsf8" event={"ID":"0cd56fa6-325c-4813-bffe-a2cd3bf82257","Type":"ContainerDied","Data":"2506a986b947dd8202511e0a11956aa33b328bb708c1d658cc22810b613a5d5e"}
Mar 14 05:54:32 crc kubenswrapper[4713]: I0314 05:54:32.339726 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dzqjx"
Mar 14 05:54:32 crc kubenswrapper[4713]: I0314 05:54:32.397374 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dzqjx"
Mar 14 05:54:32 crc kubenswrapper[4713]: E0314 05:54:32.827473 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache]"
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.129816 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzqjx"]
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.569276 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zrsf8"
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.617282 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dxn9\" (UniqueName: \"kubernetes.io/projected/0cd56fa6-325c-4813-bffe-a2cd3bf82257-kube-api-access-5dxn9\") pod \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") "
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.617374 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-combined-ca-bundle\") pod \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") "
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.617567 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-scripts\") pod \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") "
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.617591 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-config-data\") pod \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\" (UID: \"0cd56fa6-325c-4813-bffe-a2cd3bf82257\") "
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.624104 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-scripts" (OuterVolumeSpecName: "scripts") pod "0cd56fa6-325c-4813-bffe-a2cd3bf82257" (UID: "0cd56fa6-325c-4813-bffe-a2cd3bf82257"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.624896 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd56fa6-325c-4813-bffe-a2cd3bf82257-kube-api-access-5dxn9" (OuterVolumeSpecName: "kube-api-access-5dxn9") pod "0cd56fa6-325c-4813-bffe-a2cd3bf82257" (UID: "0cd56fa6-325c-4813-bffe-a2cd3bf82257"). InnerVolumeSpecName "kube-api-access-5dxn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.656535 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-config-data" (OuterVolumeSpecName: "config-data") pod "0cd56fa6-325c-4813-bffe-a2cd3bf82257" (UID: "0cd56fa6-325c-4813-bffe-a2cd3bf82257"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.671747 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cd56fa6-325c-4813-bffe-a2cd3bf82257" (UID: "0cd56fa6-325c-4813-bffe-a2cd3bf82257"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.723278 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dxn9\" (UniqueName: \"kubernetes.io/projected/0cd56fa6-325c-4813-bffe-a2cd3bf82257-kube-api-access-5dxn9\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.723445 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.723464 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:33 crc kubenswrapper[4713]: I0314 05:54:33.723474 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd56fa6-325c-4813-bffe-a2cd3bf82257-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:34 crc kubenswrapper[4713]: E0314 05:54:34.015037 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache]"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.094314 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dzqjx" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="registry-server" containerID="cri-o://11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc" gracePeriod=2
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.094775 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zrsf8"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.094836 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zrsf8" event={"ID":"0cd56fa6-325c-4813-bffe-a2cd3bf82257","Type":"ContainerDied","Data":"025ccdd158213340d3559a8af8b26bc234ed78ab1c3e94d4e6df69a8e584d005"}
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.094872 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="025ccdd158213340d3559a8af8b26bc234ed78ab1c3e94d4e6df69a8e584d005"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.214783 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 14 05:54:34 crc kubenswrapper[4713]: E0314 05:54:34.215267 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeae175-1650-4587-b866-00a6d082a849" containerName="extract-content"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215280 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeae175-1650-4587-b866-00a6d082a849" containerName="extract-content"
Mar 14 05:54:34 crc kubenswrapper[4713]: E0314 05:54:34.215295 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeae175-1650-4587-b866-00a6d082a849" containerName="extract-utilities"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215302 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeae175-1650-4587-b866-00a6d082a849" containerName="extract-utilities"
Mar 14 05:54:34 crc kubenswrapper[4713]: E0314 05:54:34.215313 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d1368b-65d8-43fa-8025-0c41a7d0dd14" containerName="heat-engine"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215319 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d1368b-65d8-43fa-8025-0c41a7d0dd14" containerName="heat-engine"
Mar 14 05:54:34 crc kubenswrapper[4713]: E0314 05:54:34.215327 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeae175-1650-4587-b866-00a6d082a849" containerName="registry-server"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215332 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeae175-1650-4587-b866-00a6d082a849" containerName="registry-server"
Mar 14 05:54:34 crc kubenswrapper[4713]: E0314 05:54:34.215484 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d80784b9-9a04-4aca-9515-d9540532b039" containerName="oc"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215493 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80784b9-9a04-4aca-9515-d9540532b039" containerName="oc"
Mar 14 05:54:34 crc kubenswrapper[4713]: E0314 05:54:34.215509 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd56fa6-325c-4813-bffe-a2cd3bf82257" containerName="nova-cell0-conductor-db-sync"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215515 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd56fa6-325c-4813-bffe-a2cd3bf82257" containerName="nova-cell0-conductor-db-sync"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215852 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeae175-1650-4587-b866-00a6d082a849" containerName="registry-server"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215868 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d1368b-65d8-43fa-8025-0c41a7d0dd14" containerName="heat-engine"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215922 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d80784b9-9a04-4aca-9515-d9540532b039" containerName="oc"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.215936 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd56fa6-325c-4813-bffe-a2cd3bf82257" containerName="nova-cell0-conductor-db-sync"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.216748 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.231102 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2xfb9"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.231314 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.237801 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.250133 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pmft\" (UniqueName: \"kubernetes.io/projected/d88f915d-d72f-4586-b435-67d75d24ecc0-kube-api-access-5pmft\") pod \"nova-cell0-conductor-0\" (UID: \"d88f915d-d72f-4586-b435-67d75d24ecc0\") " pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.250200 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88f915d-d72f-4586-b435-67d75d24ecc0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d88f915d-d72f-4586-b435-67d75d24ecc0\") " pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.250391 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88f915d-d72f-4586-b435-67d75d24ecc0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d88f915d-d72f-4586-b435-67d75d24ecc0\") " pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.353478 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pmft\" (UniqueName: \"kubernetes.io/projected/d88f915d-d72f-4586-b435-67d75d24ecc0-kube-api-access-5pmft\") pod \"nova-cell0-conductor-0\" (UID: \"d88f915d-d72f-4586-b435-67d75d24ecc0\") " pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.353555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88f915d-d72f-4586-b435-67d75d24ecc0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d88f915d-d72f-4586-b435-67d75d24ecc0\") " pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.353582 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88f915d-d72f-4586-b435-67d75d24ecc0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d88f915d-d72f-4586-b435-67d75d24ecc0\") " pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.359171 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88f915d-d72f-4586-b435-67d75d24ecc0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d88f915d-d72f-4586-b435-67d75d24ecc0\") " pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.359279 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88f915d-d72f-4586-b435-67d75d24ecc0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d88f915d-d72f-4586-b435-67d75d24ecc0\") " pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.373963 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pmft\" (UniqueName: \"kubernetes.io/projected/d88f915d-d72f-4586-b435-67d75d24ecc0-kube-api-access-5pmft\") pod \"nova-cell0-conductor-0\" (UID: \"d88f915d-d72f-4586-b435-67d75d24ecc0\") " pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.590768 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.781832 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzqjx"
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.875600 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4znlt\" (UniqueName: \"kubernetes.io/projected/8711e2a1-6795-4c17-afad-a95c01fbf5f9-kube-api-access-4znlt\") pod \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") "
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.875671 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-utilities\") pod \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") "
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.875705 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-catalog-content\") pod \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\" (UID: \"8711e2a1-6795-4c17-afad-a95c01fbf5f9\") "
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.876505 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-utilities" (OuterVolumeSpecName: "utilities") pod "8711e2a1-6795-4c17-afad-a95c01fbf5f9" (UID: "8711e2a1-6795-4c17-afad-a95c01fbf5f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.883861 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8711e2a1-6795-4c17-afad-a95c01fbf5f9-kube-api-access-4znlt" (OuterVolumeSpecName: "kube-api-access-4znlt") pod "8711e2a1-6795-4c17-afad-a95c01fbf5f9" (UID: "8711e2a1-6795-4c17-afad-a95c01fbf5f9"). InnerVolumeSpecName "kube-api-access-4znlt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.936902 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8711e2a1-6795-4c17-afad-a95c01fbf5f9" (UID: "8711e2a1-6795-4c17-afad-a95c01fbf5f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.978869 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4znlt\" (UniqueName: \"kubernetes.io/projected/8711e2a1-6795-4c17-afad-a95c01fbf5f9-kube-api-access-4znlt\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.978918 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:34 crc kubenswrapper[4713]: I0314 05:54:34.978929 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8711e2a1-6795-4c17-afad-a95c01fbf5f9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.101132 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.112320 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d88f915d-d72f-4586-b435-67d75d24ecc0","Type":"ContainerStarted","Data":"5def21c5098b276a966c6bc4293f567e8e99ac638246bb5637e64df98bedb497"}
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.115606 4713 generic.go:334] "Generic (PLEG): container finished" podID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerID="11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc" exitCode=0
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.115652 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzqjx" event={"ID":"8711e2a1-6795-4c17-afad-a95c01fbf5f9","Type":"ContainerDied","Data":"11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc"}
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.115689 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzqjx" event={"ID":"8711e2a1-6795-4c17-afad-a95c01fbf5f9","Type":"ContainerDied","Data":"b289290509595683bdea4f31f8ea3bd4ff1a384cca1bc877cba3c4d38fddae77"}
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.115720 4713 scope.go:117] "RemoveContainer" containerID="11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.115954 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzqjx"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.172931 4713 scope.go:117] "RemoveContainer" containerID="f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.186221 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzqjx"]
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.196778 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dzqjx"]
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.202375 4713 scope.go:117] "RemoveContainer" containerID="67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.226829 4713 scope.go:117] "RemoveContainer" containerID="11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc"
Mar 14 05:54:35 crc kubenswrapper[4713]: E0314 05:54:35.227338 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc\": container with ID starting with 11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc not found: ID does not exist" containerID="11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.227376 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc"} err="failed to get container status \"11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc\": rpc error: code = NotFound desc = could not find container \"11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc\": container with ID starting with 11acda4ddd056171db14a06ed29d64510d36b2a9cca8e8532e2cc454863c80bc not found: ID does not exist"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.227427 4713 scope.go:117] "RemoveContainer" containerID="f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45"
Mar 14 05:54:35 crc kubenswrapper[4713]: E0314 05:54:35.227785 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45\": container with ID starting with f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45 not found: ID does not exist" containerID="f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.227823 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45"} err="failed to get container status \"f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45\": rpc error: code = NotFound desc = could not find container \"f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45\": container with ID starting with f386436a203e4228902c6cde9fb1224049af71c681a86a4f2455db2f0722ff45 not found: ID does not exist"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.227861 4713 scope.go:117] "RemoveContainer" containerID="67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66"
Mar 14 05:54:35 crc kubenswrapper[4713]: E0314 05:54:35.228092 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66\": container with ID starting with 67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66 not found: ID does not exist" containerID="67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.228129 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66"} err="failed to get container status \"67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66\": rpc error: code = NotFound desc = could not find container \"67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66\": container with ID starting with 67e4138a6b3e8cc8966ab7e19f265ef2affa8a0ce2d5217ace79737dda4b7e66 not found: ID does not exist"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.578348 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" path="/var/lib/kubelet/pods/8711e2a1-6795-4c17-afad-a95c01fbf5f9/volumes"
Mar 14 05:54:35 crc kubenswrapper[4713]: I0314 05:54:35.984363 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 14 05:54:36 crc kubenswrapper[4713]: I0314 05:54:36.131409 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d88f915d-d72f-4586-b435-67d75d24ecc0","Type":"ContainerStarted","Data":"e2637fcd7c1435ef6183cf9244eb1946ec4be6fb4630ad994304cc5e71e84b24"}
Mar 14 05:54:36 crc kubenswrapper[4713]: I0314 05:54:36.131541 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:36 crc kubenswrapper[4713]: I0314 05:54:36.158052 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.158034087 podStartE2EDuration="2.158034087s" podCreationTimestamp="2026-03-14 05:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:54:36.145149497 +0000 UTC m=+1659.233058807" watchObservedRunningTime="2026-03-14 05:54:36.158034087 +0000 UTC m=+1659.245943387"
Mar 14 05:54:40 crc kubenswrapper[4713]: I0314 05:54:40.732056 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 05:54:40 crc kubenswrapper[4713]: I0314 05:54:40.732495 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 05:54:44 crc kubenswrapper[4713]: E0314 05:54:44.344715 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache]"
Mar 14 05:54:44 crc kubenswrapper[4713]: I0314 05:54:44.619780 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 14 05:54:44 crc kubenswrapper[4713]: I0314 05:54:44.774918 4713 scope.go:117] "RemoveContainer" containerID="fe4980687f600d8db3607ee143d285ff117a4fb69855303a300c79f037e46b27"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.168965 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ncptd"]
Mar 14 05:54:45 crc kubenswrapper[4713]: E0314 05:54:45.169471 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="extract-content"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.169488 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="extract-content"
Mar 14 05:54:45 crc kubenswrapper[4713]: E0314 05:54:45.169501 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="registry-server"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.169507 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="registry-server"
Mar 14 05:54:45 crc kubenswrapper[4713]: E0314 05:54:45.169518 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="extract-utilities"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.169525 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="extract-utilities"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.169752 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8711e2a1-6795-4c17-afad-a95c01fbf5f9" containerName="registry-server"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.172484 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.174961 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.175239 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.192359 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ncptd"]
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.237265 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.237783 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-scripts\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.238018 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42kc\" (UniqueName: \"kubernetes.io/projected/31b3a4b9-569c-4648-88bc-377d83007c53-kube-api-access-k42kc\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.238124 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-config-data\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.350248 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-scripts\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.350873 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k42kc\" (UniqueName: \"kubernetes.io/projected/31b3a4b9-569c-4648-88bc-377d83007c53-kube-api-access-k42kc\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.350966 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-config-data\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.351091 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.362785 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.390713 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-scripts\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.398542 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.411979 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.423057 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-config-data\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.426329 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.438091 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42kc\" (UniqueName: \"kubernetes.io/projected/31b3a4b9-569c-4648-88bc-377d83007c53-kube-api-access-k42kc\") pod \"nova-cell0-cell-mapping-ncptd\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.454311 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-config-data\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.454484 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67d9142-27fa-4792-aa82-243d1d036c7f-logs\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.454600 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86rgc\" (UniqueName: \"kubernetes.io/projected/f67d9142-27fa-4792-aa82-243d1d036c7f-kube-api-access-86rgc\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.454648 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0"
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.502460 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.504159 4713 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ncptd" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.555896 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67d9142-27fa-4792-aa82-243d1d036c7f-logs\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.556076 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86rgc\" (UniqueName: \"kubernetes.io/projected/f67d9142-27fa-4792-aa82-243d1d036c7f-kube-api-access-86rgc\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.556131 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.556230 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-config-data\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.557370 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67d9142-27fa-4792-aa82-243d1d036c7f-logs\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.561989 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 
05:54:45.563880 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.568960 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.575355 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86rgc\" (UniqueName: \"kubernetes.io/projected/f67d9142-27fa-4792-aa82-243d1d036c7f-kube-api-access-86rgc\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.580311 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.580537 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-config-data\") pod \"nova-api-0\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " pod="openstack/nova-api-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.626099 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.632562 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.641640 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.642018 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.646567 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.658953 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") " pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.661373 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzwqf\" (UniqueName: \"kubernetes.io/projected/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-kube-api-access-mzwqf\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.661633 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.661718 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6rfw\" (UniqueName: \"kubernetes.io/projected/fb040c49-5ad0-4283-a5cc-dc40275b5284-kube-api-access-t6rfw\") pod \"nova-scheduler-0\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") " pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.661869 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.661987 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-config-data\") pod \"nova-scheduler-0\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") " pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.667439 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.669419 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.678948 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.687305 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.756967 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-gm7d5"] Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.760532 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.764532 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thnsk\" (UniqueName: \"kubernetes.io/projected/0266f913-2a1b-401a-aaaa-720ece998a13-kube-api-access-thnsk\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.764582 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.764642 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.764668 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxmnp\" (UniqueName: \"kubernetes.io/projected/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-kube-api-access-bxmnp\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.764726 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzwqf\" (UniqueName: \"kubernetes.io/projected/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-kube-api-access-mzwqf\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.764763 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-config-data\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.764793 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.764990 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.765034 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.765054 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-config\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: 
\"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.765069 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6rfw\" (UniqueName: \"kubernetes.io/projected/fb040c49-5ad0-4283-a5cc-dc40275b5284-kube-api-access-t6rfw\") pod \"nova-scheduler-0\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") " pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.765134 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.765162 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-logs\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.765198 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-config-data\") pod \"nova-scheduler-0\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") " pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.765400 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") " pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.765426 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.772317 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.772990 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.774804 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-config-data\") pod \"nova-scheduler-0\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") " pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.781912 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") " pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.789279 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-gm7d5"] Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 
05:54:45.793388 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzwqf\" (UniqueName: \"kubernetes.io/projected/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-kube-api-access-mzwqf\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.793885 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6rfw\" (UniqueName: \"kubernetes.io/projected/fb040c49-5ad0-4283-a5cc-dc40275b5284-kube-api-access-t6rfw\") pod \"nova-scheduler-0\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") " pod="openstack/nova-scheduler-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.841675 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.867779 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.867872 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.867971 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-config\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " 
pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.868062 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-logs\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.868173 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.868245 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thnsk\" (UniqueName: \"kubernetes.io/projected/0266f913-2a1b-401a-aaaa-720ece998a13-kube-api-access-thnsk\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.868276 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.868339 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.868360 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxmnp\" (UniqueName: \"kubernetes.io/projected/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-kube-api-access-bxmnp\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.868446 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-config-data\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.870860 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.871850 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.872697 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.875073 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-logs\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.875204 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-config\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.876177 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.888558 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-config-data\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.893071 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thnsk\" (UniqueName: \"kubernetes.io/projected/0266f913-2a1b-401a-aaaa-720ece998a13-kube-api-access-thnsk\") pod \"dnsmasq-dns-568d7fd7cf-gm7d5\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") " pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.896412 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxmnp\" (UniqueName: \"kubernetes.io/projected/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-kube-api-access-bxmnp\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " 
pod="openstack/nova-metadata-0" Mar 14 05:54:45 crc kubenswrapper[4713]: I0314 05:54:45.897064 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " pod="openstack/nova-metadata-0" Mar 14 05:54:46 crc kubenswrapper[4713]: I0314 05:54:46.012144 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 05:54:46 crc kubenswrapper[4713]: I0314 05:54:46.029070 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:46 crc kubenswrapper[4713]: I0314 05:54:46.038128 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:54:46 crc kubenswrapper[4713]: I0314 05:54:46.084049 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:46 crc kubenswrapper[4713]: I0314 05:54:46.336239 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ncptd"] Mar 14 05:54:46 crc kubenswrapper[4713]: I0314 05:54:46.432794 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ncptd" event={"ID":"31b3a4b9-569c-4648-88bc-377d83007c53","Type":"ContainerStarted","Data":"9e22fd6542bd80552afd9dd1465a7cedbcb4bad5f1078e12b59037fe007bca71"} Mar 14 05:54:46 crc kubenswrapper[4713]: I0314 05:54:46.670714 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:54:46 crc kubenswrapper[4713]: I0314 05:54:46.925784 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:54:46 crc kubenswrapper[4713]: I0314 05:54:46.942902 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.180892 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-gm7d5"] Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.203258 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.478564 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67d9142-27fa-4792-aa82-243d1d036c7f","Type":"ContainerStarted","Data":"a4f65b6d0e4fcf336d980c383113117a6597b6acd15c7e994ef6398b7498fc10"} Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.483755 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f582f5f-e4b3-4a66-b366-4abfbbe100f1","Type":"ContainerStarted","Data":"7510b5e483f88cf69e6cd298e9f63e2524cf33fbb50abf1224363847aca55b7e"} Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.506013 4713 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xmbd7"] Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.508955 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.511445 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ncptd" event={"ID":"31b3a4b9-569c-4648-88bc-377d83007c53","Type":"ContainerStarted","Data":"c32f927083e23fa1d7f4747055af85bb344d63c376f173f695f534ad96517aaf"} Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.515772 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.519516 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" event={"ID":"0266f913-2a1b-401a-aaaa-720ece998a13","Type":"ContainerStarted","Data":"7af120b1fc2e7497573a36eb0f1a40367ecb0dee0e050950983259a536d10ddd"} Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.519731 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.529063 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-config-data\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.529498 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.529526 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwblv\" (UniqueName: \"kubernetes.io/projected/3f9aceec-678a-410d-be9f-e5a6a1116c0b-kube-api-access-lwblv\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.529601 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-scripts\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.546011 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb040c49-5ad0-4283-a5cc-dc40275b5284","Type":"ContainerStarted","Data":"d87fdbe1a242c70be1995e890e7a2005c6c37db76113d065dfb9542df19d65be"} Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.559348 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xmbd7"] Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.560305 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5b498d2-114c-44f2-b2d2-681d4bedbdbb","Type":"ContainerStarted","Data":"05bb35f238ecf88a35fb71efaf088916a76fc7253983258103fd40e23c03c1fd"} Mar 14 05:54:47 crc kubenswrapper[4713]: E0314 05:54:47.634097 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.636062 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ncptd" podStartSLOduration=2.636033991 podStartE2EDuration="2.636033991s" podCreationTimestamp="2026-03-14 05:54:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:54:47.559325659 +0000 UTC m=+1670.647234949" watchObservedRunningTime="2026-03-14 05:54:47.636033991 +0000 UTC m=+1670.723943291" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.641912 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-config-data\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.642059 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.642090 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwblv\" (UniqueName: 
\"kubernetes.io/projected/3f9aceec-678a-410d-be9f-e5a6a1116c0b-kube-api-access-lwblv\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.642121 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-scripts\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.655597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-scripts\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.662932 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.670935 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-config-data\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.678740 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwblv\" (UniqueName: 
\"kubernetes.io/projected/3f9aceec-678a-410d-be9f-e5a6a1116c0b-kube-api-access-lwblv\") pod \"nova-cell1-conductor-db-sync-xmbd7\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") " pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:47 crc kubenswrapper[4713]: I0314 05:54:47.940570 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xmbd7" Mar 14 05:54:48 crc kubenswrapper[4713]: E0314 05:54:48.110603 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache]" Mar 14 05:54:48 crc kubenswrapper[4713]: E0314 05:54:48.117781 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache]" Mar 14 05:54:48 crc kubenswrapper[4713]: I0314 05:54:48.588190 4713 generic.go:334] "Generic (PLEG): container finished" podID="0266f913-2a1b-401a-aaaa-720ece998a13" containerID="30193214a23d679629fd346997d39f9f43b3e83bebea357fed4b78c8ef86cf28" exitCode=0 Mar 14 05:54:48 crc kubenswrapper[4713]: I0314 05:54:48.589797 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" 
event={"ID":"0266f913-2a1b-401a-aaaa-720ece998a13","Type":"ContainerDied","Data":"30193214a23d679629fd346997d39f9f43b3e83bebea357fed4b78c8ef86cf28"} Mar 14 05:54:48 crc kubenswrapper[4713]: I0314 05:54:48.591090 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xmbd7"] Mar 14 05:54:49 crc kubenswrapper[4713]: I0314 05:54:49.632817 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:54:49 crc kubenswrapper[4713]: I0314 05:54:49.656384 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" event={"ID":"0266f913-2a1b-401a-aaaa-720ece998a13","Type":"ContainerStarted","Data":"1a74f361b0e3a1b45ed165c0ab7947b78e4fae621be9591340f7d6066644e6cd"} Mar 14 05:54:49 crc kubenswrapper[4713]: I0314 05:54:49.656713 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:49 crc kubenswrapper[4713]: I0314 05:54:49.667867 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xmbd7" event={"ID":"3f9aceec-678a-410d-be9f-e5a6a1116c0b","Type":"ContainerStarted","Data":"a151e8bfc0f807624971a02bebfee38264656b9c81d9f9d16d7de62503783952"} Mar 14 05:54:49 crc kubenswrapper[4713]: I0314 05:54:49.667929 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xmbd7" event={"ID":"3f9aceec-678a-410d-be9f-e5a6a1116c0b","Type":"ContainerStarted","Data":"4af1ed7701a29ff0cc628b3e3efbdd2d3603f98546cd1a28cca1da347380222d"} Mar 14 05:54:49 crc kubenswrapper[4713]: I0314 05:54:49.702023 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 05:54:49 crc kubenswrapper[4713]: I0314 05:54:49.703030 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" podStartSLOduration=4.70301054 
podStartE2EDuration="4.70301054s" podCreationTimestamp="2026-03-14 05:54:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:54:49.690698908 +0000 UTC m=+1672.778608208" watchObservedRunningTime="2026-03-14 05:54:49.70301054 +0000 UTC m=+1672.790919840" Mar 14 05:54:49 crc kubenswrapper[4713]: I0314 05:54:49.794775 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xmbd7" podStartSLOduration=2.794751442 podStartE2EDuration="2.794751442s" podCreationTimestamp="2026-03-14 05:54:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:54:49.7152212 +0000 UTC m=+1672.803130500" watchObservedRunningTime="2026-03-14 05:54:49.794751442 +0000 UTC m=+1672.882660742" Mar 14 05:54:50 crc kubenswrapper[4713]: I0314 05:54:50.683978 4713 generic.go:334] "Generic (PLEG): container finished" podID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerID="c865cc39277c94db5a32fe57ebcfa0a90b55fdeb4f102adb7c87e143e966402e" exitCode=137 Mar 14 05:54:50 crc kubenswrapper[4713]: I0314 05:54:50.685799 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerDied","Data":"c865cc39277c94db5a32fe57ebcfa0a90b55fdeb4f102adb7c87e143e966402e"} Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.011509 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.151385 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-sg-core-conf-yaml\") pod \"182fef0d-d2fa-4bb4-8d12-d694c42247af\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.151475 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6bcf\" (UniqueName: \"kubernetes.io/projected/182fef0d-d2fa-4bb4-8d12-d694c42247af-kube-api-access-g6bcf\") pod \"182fef0d-d2fa-4bb4-8d12-d694c42247af\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.151635 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-log-httpd\") pod \"182fef0d-d2fa-4bb4-8d12-d694c42247af\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.151668 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-config-data\") pod \"182fef0d-d2fa-4bb4-8d12-d694c42247af\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.151713 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-run-httpd\") pod \"182fef0d-d2fa-4bb4-8d12-d694c42247af\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.151752 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-scripts\") pod \"182fef0d-d2fa-4bb4-8d12-d694c42247af\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.151767 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-combined-ca-bundle\") pod \"182fef0d-d2fa-4bb4-8d12-d694c42247af\" (UID: \"182fef0d-d2fa-4bb4-8d12-d694c42247af\") " Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.153252 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "182fef0d-d2fa-4bb4-8d12-d694c42247af" (UID: "182fef0d-d2fa-4bb4-8d12-d694c42247af"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.153570 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "182fef0d-d2fa-4bb4-8d12-d694c42247af" (UID: "182fef0d-d2fa-4bb4-8d12-d694c42247af"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.157103 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-scripts" (OuterVolumeSpecName: "scripts") pod "182fef0d-d2fa-4bb4-8d12-d694c42247af" (UID: "182fef0d-d2fa-4bb4-8d12-d694c42247af"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.165431 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/182fef0d-d2fa-4bb4-8d12-d694c42247af-kube-api-access-g6bcf" (OuterVolumeSpecName: "kube-api-access-g6bcf") pod "182fef0d-d2fa-4bb4-8d12-d694c42247af" (UID: "182fef0d-d2fa-4bb4-8d12-d694c42247af"). InnerVolumeSpecName "kube-api-access-g6bcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.198632 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "182fef0d-d2fa-4bb4-8d12-d694c42247af" (UID: "182fef0d-d2fa-4bb4-8d12-d694c42247af"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.255111 4713 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.255150 4713 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/182fef0d-d2fa-4bb4-8d12-d694c42247af-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.255162 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.255171 4713 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" 
Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.255185 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6bcf\" (UniqueName: \"kubernetes.io/projected/182fef0d-d2fa-4bb4-8d12-d694c42247af-kube-api-access-g6bcf\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.292016 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "182fef0d-d2fa-4bb4-8d12-d694c42247af" (UID: "182fef0d-d2fa-4bb4-8d12-d694c42247af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.349239 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-config-data" (OuterVolumeSpecName: "config-data") pod "182fef0d-d2fa-4bb4-8d12-d694c42247af" (UID: "182fef0d-d2fa-4bb4-8d12-d694c42247af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.357792 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.357824 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182fef0d-d2fa-4bb4-8d12-d694c42247af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.705534 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"182fef0d-d2fa-4bb4-8d12-d694c42247af","Type":"ContainerDied","Data":"9872624a77e87016ee925480b14926451dfc5bb6c23ad907f8bb25840ad2abc1"} Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.705592 4713 scope.go:117] "RemoveContainer" containerID="c865cc39277c94db5a32fe57ebcfa0a90b55fdeb4f102adb7c87e143e966402e" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.705676 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.751548 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.778876 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.794263 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:51 crc kubenswrapper[4713]: E0314 05:54:51.794877 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="sg-core" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.794897 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="sg-core" Mar 14 05:54:51 crc kubenswrapper[4713]: E0314 05:54:51.794922 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="ceilometer-notification-agent" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.794930 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="ceilometer-notification-agent" Mar 14 05:54:51 crc kubenswrapper[4713]: E0314 05:54:51.794966 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="proxy-httpd" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.794977 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="proxy-httpd" Mar 14 05:54:51 crc kubenswrapper[4713]: E0314 05:54:51.794993 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="ceilometer-central-agent" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.795003 4713 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="ceilometer-central-agent" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.795327 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="ceilometer-notification-agent" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.795350 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="sg-core" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.795365 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="ceilometer-central-agent" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.795384 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" containerName="proxy-httpd" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.798034 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.804436 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.804576 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.831226 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.870235 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-run-httpd\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.870384 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-config-data\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.870580 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.870946 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-log-httpd\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " 
pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.871017 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-scripts\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.871045 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbhg5\" (UniqueName: \"kubernetes.io/projected/b8f9166f-6772-4ca5-8d81-a4c2358afc44-kube-api-access-xbhg5\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.871099 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.972985 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.973499 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-log-httpd\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.973549 4713 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-scripts\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.973582 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbhg5\" (UniqueName: \"kubernetes.io/projected/b8f9166f-6772-4ca5-8d81-a4c2358afc44-kube-api-access-xbhg5\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.973627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.973702 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-run-httpd\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.973798 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-config-data\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.974045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-log-httpd\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc 
kubenswrapper[4713]: I0314 05:54:51.974289 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-run-httpd\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.985071 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.985291 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.987899 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-config-data\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.992285 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-scripts\") pod \"ceilometer-0\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:51 crc kubenswrapper[4713]: I0314 05:54:51.996423 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbhg5\" (UniqueName: \"kubernetes.io/projected/b8f9166f-6772-4ca5-8d81-a4c2358afc44-kube-api-access-xbhg5\") pod \"ceilometer-0\" (UID: 
\"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") " pod="openstack/ceilometer-0" Mar 14 05:54:52 crc kubenswrapper[4713]: I0314 05:54:52.127371 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:54:52 crc kubenswrapper[4713]: I0314 05:54:52.196100 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:52 crc kubenswrapper[4713]: I0314 05:54:52.210772 4713 scope.go:117] "RemoveContainer" containerID="a01bdcb570d654991263d4943413a2903718138bc9c7e8b14e1815d2fff66d52" Mar 14 05:54:52 crc kubenswrapper[4713]: I0314 05:54:52.540159 4713 scope.go:117] "RemoveContainer" containerID="d57ca0cebf20a48b0d49d602f7583e3194bdaef5fc6cfc1b94e11b83adfbf0de" Mar 14 05:54:52 crc kubenswrapper[4713]: I0314 05:54:52.588502 4713 scope.go:117] "RemoveContainer" containerID="cc3cd3c5ea90e3ece22119b03bd5f1869bb745fa3417d1e2715c881f45000966" Mar 14 05:54:52 crc kubenswrapper[4713]: I0314 05:54:52.848165 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.587959 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="182fef0d-d2fa-4bb4-8d12-d694c42247af" path="/var/lib/kubelet/pods/182fef0d-d2fa-4bb4-8d12-d694c42247af/volumes" Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.734296 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5b498d2-114c-44f2-b2d2-681d4bedbdbb","Type":"ContainerStarted","Data":"ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c"} Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.734479 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a5b498d2-114c-44f2-b2d2-681d4bedbdbb" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c" 
gracePeriod=30 Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.737648 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerStarted","Data":"20011c95aeccd5bb5f40ebe9c34709dc507e19f8af4c4da321b7588f818b892f"} Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.737684 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerStarted","Data":"0f3531c376316897c55e57c2d74777c7b8cecaa42a079e1a8c89bf10a74f5a03"} Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.742147 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67d9142-27fa-4792-aa82-243d1d036c7f","Type":"ContainerStarted","Data":"73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206"} Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.742182 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67d9142-27fa-4792-aa82-243d1d036c7f","Type":"ContainerStarted","Data":"96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc"} Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.747263 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f582f5f-e4b3-4a66-b366-4abfbbe100f1","Type":"ContainerStarted","Data":"b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337"} Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.747312 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f582f5f-e4b3-4a66-b366-4abfbbe100f1","Type":"ContainerStarted","Data":"d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d"} Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.747543 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerName="nova-metadata-metadata" containerID="cri-o://b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337" gracePeriod=30 Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.747552 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerName="nova-metadata-log" containerID="cri-o://d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d" gracePeriod=30 Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.750139 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb040c49-5ad0-4283-a5cc-dc40275b5284","Type":"ContainerStarted","Data":"c4d7bf3a50ded96d88f830871f6dc232b457a9ec4ef2ec1970a2b11160bdd85d"} Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.758931 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.491863461 podStartE2EDuration="8.758914231s" podCreationTimestamp="2026-03-14 05:54:45 +0000 UTC" firstStartedPulling="2026-03-14 05:54:46.94529834 +0000 UTC m=+1670.033207640" lastFinishedPulling="2026-03-14 05:54:52.21234911 +0000 UTC m=+1675.300258410" observedRunningTime="2026-03-14 05:54:53.757640911 +0000 UTC m=+1676.845550231" watchObservedRunningTime="2026-03-14 05:54:53.758914231 +0000 UTC m=+1676.846823531" Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.786152 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.0723750499999998 podStartE2EDuration="8.786136008s" podCreationTimestamp="2026-03-14 05:54:45 +0000 UTC" firstStartedPulling="2026-03-14 05:54:47.215161546 +0000 UTC m=+1670.303070846" lastFinishedPulling="2026-03-14 05:54:52.928922504 +0000 UTC m=+1676.016831804" observedRunningTime="2026-03-14 05:54:53.78055204 +0000 UTC m=+1676.868461360" 
watchObservedRunningTime="2026-03-14 05:54:53.786136008 +0000 UTC m=+1676.874045308" Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.811025 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.514023828 podStartE2EDuration="8.811003311s" podCreationTimestamp="2026-03-14 05:54:45 +0000 UTC" firstStartedPulling="2026-03-14 05:54:46.933540455 +0000 UTC m=+1670.021449755" lastFinishedPulling="2026-03-14 05:54:52.230519938 +0000 UTC m=+1675.318429238" observedRunningTime="2026-03-14 05:54:53.79779126 +0000 UTC m=+1676.885700560" watchObservedRunningTime="2026-03-14 05:54:53.811003311 +0000 UTC m=+1676.898912611" Mar 14 05:54:53 crc kubenswrapper[4713]: I0314 05:54:53.834006 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.563487891 podStartE2EDuration="8.833986923s" podCreationTimestamp="2026-03-14 05:54:45 +0000 UTC" firstStartedPulling="2026-03-14 05:54:46.662966577 +0000 UTC m=+1669.750875887" lastFinishedPulling="2026-03-14 05:54:52.933465619 +0000 UTC m=+1676.021374919" observedRunningTime="2026-03-14 05:54:53.821144674 +0000 UTC m=+1676.909053974" watchObservedRunningTime="2026-03-14 05:54:53.833986923 +0000 UTC m=+1676.921896223" Mar 14 05:54:54 crc kubenswrapper[4713]: E0314 05:54:54.677861 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice/crio-0d1a882f80ccb02b1dd00f6d4e8121191e89f5056c615774a9b840c2bc9cf281\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805c8988_dae7_41ae_8160_75ad28990e12.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:54:54 crc kubenswrapper[4713]: I0314 05:54:54.774197 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerStarted","Data":"68dbafeb9318baf2c027b788f3e66dffc0959a917f4bf3aec64bc8a4c9535ef8"} Mar 14 05:54:54 crc kubenswrapper[4713]: I0314 05:54:54.777482 4713 generic.go:334] "Generic (PLEG): container finished" podID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerID="d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d" exitCode=143 Mar 14 05:54:54 crc kubenswrapper[4713]: I0314 05:54:54.777571 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f582f5f-e4b3-4a66-b366-4abfbbe100f1","Type":"ContainerDied","Data":"d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d"} Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.806683 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerStarted","Data":"42f2a5e13832434ef374dcf60639148caadce7c2d8e31df28a354261d6184932"} Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.812735 4713 generic.go:334] "Generic (PLEG): container finished" podID="f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" containerID="3f498a7d7eab4a99bb1cd3cb28fcb2e2b1bc211f98ac82859d3017abc3c1e54c" exitCode=137 Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.812786 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-684cc7695b-tnj9p" event={"ID":"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c","Type":"ContainerDied","Data":"3f498a7d7eab4a99bb1cd3cb28fcb2e2b1bc211f98ac82859d3017abc3c1e54c"} Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.812814 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-684cc7695b-tnj9p" event={"ID":"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c","Type":"ContainerDied","Data":"1b46036c4f335b3c4ea4e02cf16db9e96243c7f83e8edd2b5703a07629506cd5"} Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.812825 4713 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b46036c4f335b3c4ea4e02cf16db9e96243c7f83e8edd2b5703a07629506cd5" Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.841065 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.841189 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.907635 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-684cc7695b-tnj9p" Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.995046 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-combined-ca-bundle\") pod \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.995229 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data\") pod \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.995266 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt6fn\" (UniqueName: \"kubernetes.io/projected/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-kube-api-access-dt6fn\") pod \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " Mar 14 05:54:55 crc kubenswrapper[4713]: I0314 05:54:55.995298 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data-custom\") pod \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\" (UID: \"f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c\") " Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.002386 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" (UID: "f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.012665 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.015508 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.017066 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-kube-api-access-dt6fn" (OuterVolumeSpecName: "kube-api-access-dt6fn") pod "f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" (UID: "f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c"). InnerVolumeSpecName "kube-api-access-dt6fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.030360 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.033607 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" (UID: "f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.069123 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.091261 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.098336 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt6fn\" (UniqueName: \"kubernetes.io/projected/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-kube-api-access-dt6fn\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.098374 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.098383 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.111885 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data" (OuterVolumeSpecName: "config-data") pod "f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" (UID: "f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.173004 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-5cxjw"] Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.173626 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" podUID="13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" containerName="dnsmasq-dns" containerID="cri-o://89312b22f73ba4ceb12ed97adaa5487d8bd45d9885947d3935142cdff17922f5" gracePeriod=10 Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.200350 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.835995 4713 generic.go:334] "Generic (PLEG): container finished" podID="31b3a4b9-569c-4648-88bc-377d83007c53" containerID="c32f927083e23fa1d7f4747055af85bb344d63c376f173f695f534ad96517aaf" exitCode=0 Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.836576 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ncptd" event={"ID":"31b3a4b9-569c-4648-88bc-377d83007c53","Type":"ContainerDied","Data":"c32f927083e23fa1d7f4747055af85bb344d63c376f173f695f534ad96517aaf"} Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.842124 4713 generic.go:334] "Generic (PLEG): container finished" podID="13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" containerID="89312b22f73ba4ceb12ed97adaa5487d8bd45d9885947d3935142cdff17922f5" exitCode=0 Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.842238 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-684cc7695b-tnj9p" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.842312 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" event={"ID":"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3","Type":"ContainerDied","Data":"89312b22f73ba4ceb12ed97adaa5487d8bd45d9885947d3935142cdff17922f5"} Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.911609 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.927431 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.245:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.927766 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.245:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 05:54:56 crc kubenswrapper[4713]: I0314 05:54:56.985874 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.023719 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-684cc7695b-tnj9p"] Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.024538 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-svc\") pod \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.024681 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-swift-storage-0\") pod \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.024773 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-config\") pod \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.024832 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-nb\") pod \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.024928 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxmt9\" (UniqueName: \"kubernetes.io/projected/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-kube-api-access-gxmt9\") pod \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " 
Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.025002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-sb\") pod \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\" (UID: \"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3\") " Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.067748 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-kube-api-access-gxmt9" (OuterVolumeSpecName: "kube-api-access-gxmt9") pod "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" (UID: "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3"). InnerVolumeSpecName "kube-api-access-gxmt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.067766 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-684cc7695b-tnj9p"] Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.103081 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" (UID: "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.108544 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" (UID: "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.132143 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.132175 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.132184 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxmt9\" (UniqueName: \"kubernetes.io/projected/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-kube-api-access-gxmt9\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.142072 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" (UID: "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.142720 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-config" (OuterVolumeSpecName: "config") pod "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" (UID: "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.156478 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" (UID: "13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.234167 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.234196 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.234222 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.586817 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" path="/var/lib/kubelet/pods/f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c/volumes" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.871655 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerStarted","Data":"cfe54626451b26876a8fa637d0196c3a247154ff67988db7f58cf5b1db5d4b1a"} Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.871998 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="ceilometer-central-agent" containerID="cri-o://20011c95aeccd5bb5f40ebe9c34709dc507e19f8af4c4da321b7588f818b892f" gracePeriod=30 Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.872322 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="proxy-httpd" containerID="cri-o://cfe54626451b26876a8fa637d0196c3a247154ff67988db7f58cf5b1db5d4b1a" gracePeriod=30 Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.872345 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.872393 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="sg-core" containerID="cri-o://42f2a5e13832434ef374dcf60639148caadce7c2d8e31df28a354261d6184932" gracePeriod=30 Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.872446 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="ceilometer-notification-agent" containerID="cri-o://68dbafeb9318baf2c027b788f3e66dffc0959a917f4bf3aec64bc8a4c9535ef8" gracePeriod=30 Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.889432 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" event={"ID":"13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3","Type":"ContainerDied","Data":"ef0833e1e4a75cc7baf3bc0300fd64574cc078f31a1beae868833c6bf62349b5"} Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.889497 4713 scope.go:117] "RemoveContainer" containerID="89312b22f73ba4ceb12ed97adaa5487d8bd45d9885947d3935142cdff17922f5" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.889465 4713 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-5cxjw" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.896403 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7979005040000002 podStartE2EDuration="6.896386911s" podCreationTimestamp="2026-03-14 05:54:51 +0000 UTC" firstStartedPulling="2026-03-14 05:54:52.904089304 +0000 UTC m=+1675.991998604" lastFinishedPulling="2026-03-14 05:54:57.002575711 +0000 UTC m=+1680.090485011" observedRunningTime="2026-03-14 05:54:57.89256618 +0000 UTC m=+1680.980475480" watchObservedRunningTime="2026-03-14 05:54:57.896386911 +0000 UTC m=+1680.984296211" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.941944 4713 scope.go:117] "RemoveContainer" containerID="e5d52cf69f89e4aed1a905e924a2c925ec8432e1e1e0a66cc6aa84c7e8c2c09a" Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.954401 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-5cxjw"] Mar 14 05:54:57 crc kubenswrapper[4713]: I0314 05:54:57.970170 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-5cxjw"] Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.440860 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ncptd" Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.471056 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-scripts\") pod \"31b3a4b9-569c-4648-88bc-377d83007c53\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.471195 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-config-data\") pod \"31b3a4b9-569c-4648-88bc-377d83007c53\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.471232 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-combined-ca-bundle\") pod \"31b3a4b9-569c-4648-88bc-377d83007c53\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.471381 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k42kc\" (UniqueName: \"kubernetes.io/projected/31b3a4b9-569c-4648-88bc-377d83007c53-kube-api-access-k42kc\") pod \"31b3a4b9-569c-4648-88bc-377d83007c53\" (UID: \"31b3a4b9-569c-4648-88bc-377d83007c53\") " Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.481590 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-scripts" (OuterVolumeSpecName: "scripts") pod "31b3a4b9-569c-4648-88bc-377d83007c53" (UID: "31b3a4b9-569c-4648-88bc-377d83007c53"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.482726 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31b3a4b9-569c-4648-88bc-377d83007c53-kube-api-access-k42kc" (OuterVolumeSpecName: "kube-api-access-k42kc") pod "31b3a4b9-569c-4648-88bc-377d83007c53" (UID: "31b3a4b9-569c-4648-88bc-377d83007c53"). InnerVolumeSpecName "kube-api-access-k42kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.522879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31b3a4b9-569c-4648-88bc-377d83007c53" (UID: "31b3a4b9-569c-4648-88bc-377d83007c53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.542306 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-config-data" (OuterVolumeSpecName: "config-data") pod "31b3a4b9-569c-4648-88bc-377d83007c53" (UID: "31b3a4b9-569c-4648-88bc-377d83007c53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.574947 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k42kc\" (UniqueName: \"kubernetes.io/projected/31b3a4b9-569c-4648-88bc-377d83007c53-kube-api-access-k42kc\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.575168 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.575327 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.575404 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b3a4b9-569c-4648-88bc-377d83007c53-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.908715 4713 generic.go:334] "Generic (PLEG): container finished" podID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerID="cfe54626451b26876a8fa637d0196c3a247154ff67988db7f58cf5b1db5d4b1a" exitCode=0
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.909077 4713 generic.go:334] "Generic (PLEG): container finished" podID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerID="42f2a5e13832434ef374dcf60639148caadce7c2d8e31df28a354261d6184932" exitCode=2
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.909088 4713 generic.go:334] "Generic (PLEG): container finished" podID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerID="68dbafeb9318baf2c027b788f3e66dffc0959a917f4bf3aec64bc8a4c9535ef8" exitCode=0
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.908842 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerDied","Data":"cfe54626451b26876a8fa637d0196c3a247154ff67988db7f58cf5b1db5d4b1a"}
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.909148 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerDied","Data":"42f2a5e13832434ef374dcf60639148caadce7c2d8e31df28a354261d6184932"}
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.909162 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerDied","Data":"68dbafeb9318baf2c027b788f3e66dffc0959a917f4bf3aec64bc8a4c9535ef8"}
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.915647 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ncptd"
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.919499 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ncptd" event={"ID":"31b3a4b9-569c-4648-88bc-377d83007c53","Type":"ContainerDied","Data":"9e22fd6542bd80552afd9dd1465a7cedbcb4bad5f1078e12b59037fe007bca71"}
Mar 14 05:54:58 crc kubenswrapper[4713]: I0314 05:54:58.919552 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e22fd6542bd80552afd9dd1465a7cedbcb4bad5f1078e12b59037fe007bca71"
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.043887 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.044453 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-log" containerID="cri-o://96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc" gracePeriod=30
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.044679 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-api" containerID="cri-o://73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206" gracePeriod=30
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.076469 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.576984 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" path="/var/lib/kubelet/pods/13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3/volumes"
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.930164 4713 generic.go:334] "Generic (PLEG): container finished" podID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerID="96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc" exitCode=143
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.930340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67d9142-27fa-4792-aa82-243d1d036c7f","Type":"ContainerDied","Data":"96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc"}
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.933386 4713 generic.go:334] "Generic (PLEG): container finished" podID="3f9aceec-678a-410d-be9f-e5a6a1116c0b" containerID="a151e8bfc0f807624971a02bebfee38264656b9c81d9f9d16d7de62503783952" exitCode=0
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.933471 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xmbd7" event={"ID":"3f9aceec-678a-410d-be9f-e5a6a1116c0b","Type":"ContainerDied","Data":"a151e8bfc0f807624971a02bebfee38264656b9c81d9f9d16d7de62503783952"}
Mar 14 05:54:59 crc kubenswrapper[4713]: I0314 05:54:59.933594 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fb040c49-5ad0-4283-a5cc-dc40275b5284" containerName="nova-scheduler-scheduler" containerID="cri-o://c4d7bf3a50ded96d88f830871f6dc232b457a9ec4ef2ec1970a2b11160bdd85d" gracePeriod=30
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.437995 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-958lf"]
Mar 14 05:55:00 crc kubenswrapper[4713]: E0314 05:55:00.441189 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" containerName="dnsmasq-dns"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.441261 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" containerName="dnsmasq-dns"
Mar 14 05:55:00 crc kubenswrapper[4713]: E0314 05:55:00.441384 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" containerName="init"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.441401 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" containerName="init"
Mar 14 05:55:00 crc kubenswrapper[4713]: E0314 05:55:00.441425 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b3a4b9-569c-4648-88bc-377d83007c53" containerName="nova-manage"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.441432 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b3a4b9-569c-4648-88bc-377d83007c53" containerName="nova-manage"
Mar 14 05:55:00 crc kubenswrapper[4713]: E0314 05:55:00.441459 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" containerName="heat-cfnapi"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.441467 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" containerName="heat-cfnapi"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.441945 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="31b3a4b9-569c-4648-88bc-377d83007c53" containerName="nova-manage"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.441973 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="13495e5d-b3b5-46aa-8f15-4f6f8d4e85d3" containerName="dnsmasq-dns"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.442001 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49e8d24-8ec7-4c69-ba02-eacf7a6fca9c" containerName="heat-cfnapi"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.443405 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-958lf"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.465279 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-958lf"]
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.532383 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fcjb\" (UniqueName: \"kubernetes.io/projected/66beacbe-5dac-4e64-a18a-092775f976ca-kube-api-access-8fcjb\") pod \"aodh-db-create-958lf\" (UID: \"66beacbe-5dac-4e64-a18a-092775f976ca\") " pod="openstack/aodh-db-create-958lf"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.533031 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66beacbe-5dac-4e64-a18a-092775f976ca-operator-scripts\") pod \"aodh-db-create-958lf\" (UID: \"66beacbe-5dac-4e64-a18a-092775f976ca\") " pod="openstack/aodh-db-create-958lf"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.590725 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-38dc-account-create-update-jstb7"]
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.593263 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-38dc-account-create-update-jstb7"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.595895 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.604589 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-38dc-account-create-update-jstb7"]
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.634931 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7058c887-6736-417a-83c0-18e8e9ca53f3-operator-scripts\") pod \"aodh-38dc-account-create-update-jstb7\" (UID: \"7058c887-6736-417a-83c0-18e8e9ca53f3\") " pod="openstack/aodh-38dc-account-create-update-jstb7"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.635160 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ddsc\" (UniqueName: \"kubernetes.io/projected/7058c887-6736-417a-83c0-18e8e9ca53f3-kube-api-access-4ddsc\") pod \"aodh-38dc-account-create-update-jstb7\" (UID: \"7058c887-6736-417a-83c0-18e8e9ca53f3\") " pod="openstack/aodh-38dc-account-create-update-jstb7"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.635323 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66beacbe-5dac-4e64-a18a-092775f976ca-operator-scripts\") pod \"aodh-db-create-958lf\" (UID: \"66beacbe-5dac-4e64-a18a-092775f976ca\") " pod="openstack/aodh-db-create-958lf"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.635396 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fcjb\" (UniqueName: \"kubernetes.io/projected/66beacbe-5dac-4e64-a18a-092775f976ca-kube-api-access-8fcjb\") pod \"aodh-db-create-958lf\" (UID: \"66beacbe-5dac-4e64-a18a-092775f976ca\") " pod="openstack/aodh-db-create-958lf"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.637408 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66beacbe-5dac-4e64-a18a-092775f976ca-operator-scripts\") pod \"aodh-db-create-958lf\" (UID: \"66beacbe-5dac-4e64-a18a-092775f976ca\") " pod="openstack/aodh-db-create-958lf"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.664060 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fcjb\" (UniqueName: \"kubernetes.io/projected/66beacbe-5dac-4e64-a18a-092775f976ca-kube-api-access-8fcjb\") pod \"aodh-db-create-958lf\" (UID: \"66beacbe-5dac-4e64-a18a-092775f976ca\") " pod="openstack/aodh-db-create-958lf"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.737309 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ddsc\" (UniqueName: \"kubernetes.io/projected/7058c887-6736-417a-83c0-18e8e9ca53f3-kube-api-access-4ddsc\") pod \"aodh-38dc-account-create-update-jstb7\" (UID: \"7058c887-6736-417a-83c0-18e8e9ca53f3\") " pod="openstack/aodh-38dc-account-create-update-jstb7"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.737730 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7058c887-6736-417a-83c0-18e8e9ca53f3-operator-scripts\") pod \"aodh-38dc-account-create-update-jstb7\" (UID: \"7058c887-6736-417a-83c0-18e8e9ca53f3\") " pod="openstack/aodh-38dc-account-create-update-jstb7"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.738662 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7058c887-6736-417a-83c0-18e8e9ca53f3-operator-scripts\") pod \"aodh-38dc-account-create-update-jstb7\" (UID: \"7058c887-6736-417a-83c0-18e8e9ca53f3\") " pod="openstack/aodh-38dc-account-create-update-jstb7"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.757961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ddsc\" (UniqueName: \"kubernetes.io/projected/7058c887-6736-417a-83c0-18e8e9ca53f3-kube-api-access-4ddsc\") pod \"aodh-38dc-account-create-update-jstb7\" (UID: \"7058c887-6736-417a-83c0-18e8e9ca53f3\") " pod="openstack/aodh-38dc-account-create-update-jstb7"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.770892 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-958lf"
Mar 14 05:55:00 crc kubenswrapper[4713]: I0314 05:55:00.911970 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-38dc-account-create-update-jstb7"
Mar 14 05:55:01 crc kubenswrapper[4713]: E0314 05:55:01.019438 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c4d7bf3a50ded96d88f830871f6dc232b457a9ec4ef2ec1970a2b11160bdd85d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 14 05:55:01 crc kubenswrapper[4713]: E0314 05:55:01.023248 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c4d7bf3a50ded96d88f830871f6dc232b457a9ec4ef2ec1970a2b11160bdd85d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 14 05:55:01 crc kubenswrapper[4713]: E0314 05:55:01.037333 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c4d7bf3a50ded96d88f830871f6dc232b457a9ec4ef2ec1970a2b11160bdd85d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 14 05:55:01 crc kubenswrapper[4713]: E0314 05:55:01.037412 4713 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fb040c49-5ad0-4283-a5cc-dc40275b5284" containerName="nova-scheduler-scheduler"
Mar 14 05:55:01 crc kubenswrapper[4713]: W0314 05:55:01.368571 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66beacbe_5dac_4e64_a18a_092775f976ca.slice/crio-96ca229ea7aa7e756d96e706d0e3a316e192a346897599b07dc43a74590dc8f9 WatchSource:0}: Error finding container 96ca229ea7aa7e756d96e706d0e3a316e192a346897599b07dc43a74590dc8f9: Status 404 returned error can't find the container with id 96ca229ea7aa7e756d96e706d0e3a316e192a346897599b07dc43a74590dc8f9
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.425361 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-958lf"]
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.456928 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xmbd7"
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.557811 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-config-data\") pod \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") "
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.558088 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-scripts\") pod \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") "
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.558500 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwblv\" (UniqueName: \"kubernetes.io/projected/3f9aceec-678a-410d-be9f-e5a6a1116c0b-kube-api-access-lwblv\") pod \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") "
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.558695 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-combined-ca-bundle\") pod \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\" (UID: \"3f9aceec-678a-410d-be9f-e5a6a1116c0b\") "
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.565198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9aceec-678a-410d-be9f-e5a6a1116c0b-kube-api-access-lwblv" (OuterVolumeSpecName: "kube-api-access-lwblv") pod "3f9aceec-678a-410d-be9f-e5a6a1116c0b" (UID: "3f9aceec-678a-410d-be9f-e5a6a1116c0b"). InnerVolumeSpecName "kube-api-access-lwblv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.565972 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-scripts" (OuterVolumeSpecName: "scripts") pod "3f9aceec-678a-410d-be9f-e5a6a1116c0b" (UID: "3f9aceec-678a-410d-be9f-e5a6a1116c0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.613644 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-config-data" (OuterVolumeSpecName: "config-data") pod "3f9aceec-678a-410d-be9f-e5a6a1116c0b" (UID: "3f9aceec-678a-410d-be9f-e5a6a1116c0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.647793 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f9aceec-678a-410d-be9f-e5a6a1116c0b" (UID: "3f9aceec-678a-410d-be9f-e5a6a1116c0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.662117 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.662157 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.662168 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwblv\" (UniqueName: \"kubernetes.io/projected/3f9aceec-678a-410d-be9f-e5a6a1116c0b-kube-api-access-lwblv\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.662181 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9aceec-678a-410d-be9f-e5a6a1116c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.679504 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-38dc-account-create-update-jstb7"]
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.962485 4713 generic.go:334] "Generic (PLEG): container finished" podID="fb040c49-5ad0-4283-a5cc-dc40275b5284" containerID="c4d7bf3a50ded96d88f830871f6dc232b457a9ec4ef2ec1970a2b11160bdd85d" exitCode=0
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.962549 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb040c49-5ad0-4283-a5cc-dc40275b5284","Type":"ContainerDied","Data":"c4d7bf3a50ded96d88f830871f6dc232b457a9ec4ef2ec1970a2b11160bdd85d"}
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.962619 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb040c49-5ad0-4283-a5cc-dc40275b5284","Type":"ContainerDied","Data":"d87fdbe1a242c70be1995e890e7a2005c6c37db76113d065dfb9542df19d65be"}
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.962634 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d87fdbe1a242c70be1995e890e7a2005c6c37db76113d065dfb9542df19d65be"
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.964484 4713 generic.go:334] "Generic (PLEG): container finished" podID="66beacbe-5dac-4e64-a18a-092775f976ca" containerID="b02a0c7449b8cc3ded2bbce1e906b44b18d8684e7dc3a27ea2d7930d20b8753a" exitCode=0
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.964583 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-958lf" event={"ID":"66beacbe-5dac-4e64-a18a-092775f976ca","Type":"ContainerDied","Data":"b02a0c7449b8cc3ded2bbce1e906b44b18d8684e7dc3a27ea2d7930d20b8753a"}
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.964632 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-958lf" event={"ID":"66beacbe-5dac-4e64-a18a-092775f976ca","Type":"ContainerStarted","Data":"96ca229ea7aa7e756d96e706d0e3a316e192a346897599b07dc43a74590dc8f9"}
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.966142 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-38dc-account-create-update-jstb7" event={"ID":"7058c887-6736-417a-83c0-18e8e9ca53f3","Type":"ContainerStarted","Data":"6291c2530762c461841e24bd60b23c1dfcdeefd90991171c66cb012080a2f6ee"}
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.971722 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xmbd7" event={"ID":"3f9aceec-678a-410d-be9f-e5a6a1116c0b","Type":"ContainerDied","Data":"4af1ed7701a29ff0cc628b3e3efbdd2d3603f98546cd1a28cca1da347380222d"}
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.971772 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4af1ed7701a29ff0cc628b3e3efbdd2d3603f98546cd1a28cca1da347380222d"
Mar 14 05:55:01 crc kubenswrapper[4713]: I0314 05:55:01.971845 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xmbd7"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.042858 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.043552 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 14 05:55:02 crc kubenswrapper[4713]: E0314 05:55:02.044035 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb040c49-5ad0-4283-a5cc-dc40275b5284" containerName="nova-scheduler-scheduler"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.044048 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb040c49-5ad0-4283-a5cc-dc40275b5284" containerName="nova-scheduler-scheduler"
Mar 14 05:55:02 crc kubenswrapper[4713]: E0314 05:55:02.044065 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9aceec-678a-410d-be9f-e5a6a1116c0b" containerName="nova-cell1-conductor-db-sync"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.044072 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9aceec-678a-410d-be9f-e5a6a1116c0b" containerName="nova-cell1-conductor-db-sync"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.044297 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb040c49-5ad0-4283-a5cc-dc40275b5284" containerName="nova-scheduler-scheduler"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.044334 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9aceec-678a-410d-be9f-e5a6a1116c0b" containerName="nova-cell1-conductor-db-sync"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.045098 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.047561 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.071772 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6rfw\" (UniqueName: \"kubernetes.io/projected/fb040c49-5ad0-4283-a5cc-dc40275b5284-kube-api-access-t6rfw\") pod \"fb040c49-5ad0-4283-a5cc-dc40275b5284\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") "
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.072002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-config-data\") pod \"fb040c49-5ad0-4283-a5cc-dc40275b5284\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") "
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.072154 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-combined-ca-bundle\") pod \"fb040c49-5ad0-4283-a5cc-dc40275b5284\" (UID: \"fb040c49-5ad0-4283-a5cc-dc40275b5284\") "
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.077572 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb040c49-5ad0-4283-a5cc-dc40275b5284-kube-api-access-t6rfw" (OuterVolumeSpecName: "kube-api-access-t6rfw") pod "fb040c49-5ad0-4283-a5cc-dc40275b5284" (UID: "fb040c49-5ad0-4283-a5cc-dc40275b5284"). InnerVolumeSpecName "kube-api-access-t6rfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.077728 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94aaad06-2233-4127-9b26-d9fc2b6ff597-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"94aaad06-2233-4127-9b26-d9fc2b6ff597\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.078000 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75bdj\" (UniqueName: \"kubernetes.io/projected/94aaad06-2233-4127-9b26-d9fc2b6ff597-kube-api-access-75bdj\") pod \"nova-cell1-conductor-0\" (UID: \"94aaad06-2233-4127-9b26-d9fc2b6ff597\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.078173 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94aaad06-2233-4127-9b26-d9fc2b6ff597-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"94aaad06-2233-4127-9b26-d9fc2b6ff597\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.082305 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.090865 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6rfw\" (UniqueName: \"kubernetes.io/projected/fb040c49-5ad0-4283-a5cc-dc40275b5284-kube-api-access-t6rfw\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.131435 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-config-data" (OuterVolumeSpecName: "config-data") pod "fb040c49-5ad0-4283-a5cc-dc40275b5284" (UID: "fb040c49-5ad0-4283-a5cc-dc40275b5284"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.133651 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb040c49-5ad0-4283-a5cc-dc40275b5284" (UID: "fb040c49-5ad0-4283-a5cc-dc40275b5284"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.192670 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94aaad06-2233-4127-9b26-d9fc2b6ff597-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"94aaad06-2233-4127-9b26-d9fc2b6ff597\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.192909 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94aaad06-2233-4127-9b26-d9fc2b6ff597-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"94aaad06-2233-4127-9b26-d9fc2b6ff597\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.193014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75bdj\" (UniqueName: \"kubernetes.io/projected/94aaad06-2233-4127-9b26-d9fc2b6ff597-kube-api-access-75bdj\") pod \"nova-cell1-conductor-0\" (UID: \"94aaad06-2233-4127-9b26-d9fc2b6ff597\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.193091 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.193103 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb040c49-5ad0-4283-a5cc-dc40275b5284-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.196920 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94aaad06-2233-4127-9b26-d9fc2b6ff597-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"94aaad06-2233-4127-9b26-d9fc2b6ff597\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.197004 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94aaad06-2233-4127-9b26-d9fc2b6ff597-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"94aaad06-2233-4127-9b26-d9fc2b6ff597\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.211093 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75bdj\" (UniqueName: \"kubernetes.io/projected/94aaad06-2233-4127-9b26-d9fc2b6ff597-kube-api-access-75bdj\") pod \"nova-cell1-conductor-0\" (UID: \"94aaad06-2233-4127-9b26-d9fc2b6ff597\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.386084 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:02 crc kubenswrapper[4713]: E0314 05:55:02.740487 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]"
Mar 14 05:55:02 crc kubenswrapper[4713]: I0314 05:55:02.872554 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.000239 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"94aaad06-2233-4127-9b26-d9fc2b6ff597","Type":"ContainerStarted","Data":"ac712034c5a39db6c2666b3c36eb32268ab38c77eb4513923cf97d6ae0f14c74"}
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.002824 4713 generic.go:334] "Generic (PLEG): container finished" podID="7058c887-6736-417a-83c0-18e8e9ca53f3" containerID="b1d46683989bb8a0b72c9fb1594ba638639a58ed0ac497ed47cd3be3d800aa38" exitCode=0
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.003149 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-38dc-account-create-update-jstb7" event={"ID":"7058c887-6736-417a-83c0-18e8e9ca53f3","Type":"ContainerDied","Data":"b1d46683989bb8a0b72c9fb1594ba638639a58ed0ac497ed47cd3be3d800aa38"}
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.003343 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.153717 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.166413 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.184570 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.187071 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.189866 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.197826 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.217547 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-config-data\") pod \"nova-scheduler-0\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.217632 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.217725 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6jrz\" (UniqueName: \"kubernetes.io/projected/63f5eb7b-a993-4198-9ed7-4b7522223fb7-kube-api-access-c6jrz\") pod \"nova-scheduler-0\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.324919 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6jrz\" (UniqueName: \"kubernetes.io/projected/63f5eb7b-a993-4198-9ed7-4b7522223fb7-kube-api-access-c6jrz\") pod \"nova-scheduler-0\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.325366 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-config-data\") pod \"nova-scheduler-0\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.325554 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.335551 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.335798 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-config-data\") pod \"nova-scheduler-0\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.347774 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6jrz\" (UniqueName: \"kubernetes.io/projected/63f5eb7b-a993-4198-9ed7-4b7522223fb7-kube-api-access-c6jrz\") pod \"nova-scheduler-0\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.443417 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-958lf"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.515612 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.539420 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fcjb\" (UniqueName: \"kubernetes.io/projected/66beacbe-5dac-4e64-a18a-092775f976ca-kube-api-access-8fcjb\") pod \"66beacbe-5dac-4e64-a18a-092775f976ca\" (UID: \"66beacbe-5dac-4e64-a18a-092775f976ca\") "
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.539587 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66beacbe-5dac-4e64-a18a-092775f976ca-operator-scripts\") pod \"66beacbe-5dac-4e64-a18a-092775f976ca\" (UID: \"66beacbe-5dac-4e64-a18a-092775f976ca\") "
Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.540993 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66beacbe-5dac-4e64-a18a-092775f976ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66beacbe-5dac-4e64-a18a-092775f976ca" (UID: "66beacbe-5dac-4e64-a18a-092775f976ca"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.554939 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66beacbe-5dac-4e64-a18a-092775f976ca-kube-api-access-8fcjb" (OuterVolumeSpecName: "kube-api-access-8fcjb") pod "66beacbe-5dac-4e64-a18a-092775f976ca" (UID: "66beacbe-5dac-4e64-a18a-092775f976ca"). InnerVolumeSpecName "kube-api-access-8fcjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.582015 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb040c49-5ad0-4283-a5cc-dc40275b5284" path="/var/lib/kubelet/pods/fb040c49-5ad0-4283-a5cc-dc40275b5284/volumes" Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.643556 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66beacbe-5dac-4e64-a18a-092775f976ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.643606 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fcjb\" (UniqueName: \"kubernetes.io/projected/66beacbe-5dac-4e64-a18a-092775f976ca-kube-api-access-8fcjb\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.840770 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 05:55:03 crc kubenswrapper[4713]: I0314 05:55:03.841161 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.019681 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.030572 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"94aaad06-2233-4127-9b26-d9fc2b6ff597","Type":"ContainerStarted","Data":"7f79a12554b2bf76e1348728cd5ed62d4ce7c80714e722f448f379a770ebab68"} Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.030655 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.041471 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.041561 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.043670 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-958lf" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.043671 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-958lf" event={"ID":"66beacbe-5dac-4e64-a18a-092775f976ca","Type":"ContainerDied","Data":"96ca229ea7aa7e756d96e706d0e3a316e192a346897599b07dc43a74590dc8f9"} Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.044055 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96ca229ea7aa7e756d96e706d0e3a316e192a346897599b07dc43a74590dc8f9" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.051507 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67d9142-27fa-4792-aa82-243d1d036c7f-logs\") pod \"f67d9142-27fa-4792-aa82-243d1d036c7f\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.051599 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-combined-ca-bundle\") pod \"f67d9142-27fa-4792-aa82-243d1d036c7f\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.052176 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f67d9142-27fa-4792-aa82-243d1d036c7f-logs" (OuterVolumeSpecName: "logs") pod "f67d9142-27fa-4792-aa82-243d1d036c7f" (UID: "f67d9142-27fa-4792-aa82-243d1d036c7f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.052643 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86rgc\" (UniqueName: \"kubernetes.io/projected/f67d9142-27fa-4792-aa82-243d1d036c7f-kube-api-access-86rgc\") pod \"f67d9142-27fa-4792-aa82-243d1d036c7f\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.052717 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-config-data\") pod \"f67d9142-27fa-4792-aa82-243d1d036c7f\" (UID: \"f67d9142-27fa-4792-aa82-243d1d036c7f\") " Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.053884 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67d9142-27fa-4792-aa82-243d1d036c7f-logs\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.060561 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67d9142-27fa-4792-aa82-243d1d036c7f-kube-api-access-86rgc" (OuterVolumeSpecName: "kube-api-access-86rgc") pod "f67d9142-27fa-4792-aa82-243d1d036c7f" (UID: "f67d9142-27fa-4792-aa82-243d1d036c7f"). 
InnerVolumeSpecName "kube-api-access-86rgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.060880 4713 generic.go:334] "Generic (PLEG): container finished" podID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerID="73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206" exitCode=0 Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.061440 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67d9142-27fa-4792-aa82-243d1d036c7f","Type":"ContainerDied","Data":"73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206"} Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.061512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f67d9142-27fa-4792-aa82-243d1d036c7f","Type":"ContainerDied","Data":"a4f65b6d0e4fcf336d980c383113117a6597b6acd15c7e994ef6398b7498fc10"} Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.061551 4713 scope.go:117] "RemoveContainer" containerID="73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.061691 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: W0314 05:55:04.110483 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63f5eb7b_a993_4198_9ed7_4b7522223fb7.slice/crio-8592a757bdd2c58cc8fb38b13efa42561cfd7bf25e1c61206c5893f2f1f28052 WatchSource:0}: Error finding container 8592a757bdd2c58cc8fb38b13efa42561cfd7bf25e1c61206c5893f2f1f28052: Status 404 returned error can't find the container with id 8592a757bdd2c58cc8fb38b13efa42561cfd7bf25e1c61206c5893f2f1f28052 Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.115917 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f67d9142-27fa-4792-aa82-243d1d036c7f" (UID: "f67d9142-27fa-4792-aa82-243d1d036c7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.115918 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.115896898 podStartE2EDuration="2.115896898s" podCreationTimestamp="2026-03-14 05:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:04.06118069 +0000 UTC m=+1687.149089990" watchObservedRunningTime="2026-03-14 05:55:04.115896898 +0000 UTC m=+1687.203806188" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.158005 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.158783 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.158819 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86rgc\" (UniqueName: \"kubernetes.io/projected/f67d9142-27fa-4792-aa82-243d1d036c7f-kube-api-access-86rgc\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.167118 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-config-data" (OuterVolumeSpecName: "config-data") pod "f67d9142-27fa-4792-aa82-243d1d036c7f" (UID: "f67d9142-27fa-4792-aa82-243d1d036c7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.257873 4713 scope.go:117] "RemoveContainer" containerID="96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.261341 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67d9142-27fa-4792-aa82-243d1d036c7f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.289488 4713 scope.go:117] "RemoveContainer" containerID="73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206" Mar 14 05:55:04 crc kubenswrapper[4713]: E0314 05:55:04.290530 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206\": container with ID starting with 73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206 not found: ID does not exist" containerID="73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.290568 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206"} err="failed to get container status \"73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206\": rpc error: code = NotFound desc = could not find container \"73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206\": container with ID starting with 73b36ed9448652be6b485f2f7862f5b70de06aa86cdd2158f0ff38ea1888f206 not found: ID does not exist" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.290593 4713 scope.go:117] "RemoveContainer" containerID="96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc" Mar 14 05:55:04 crc kubenswrapper[4713]: E0314 05:55:04.290962 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc\": container with ID starting with 96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc not found: ID does not exist" containerID="96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.291008 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc"} err="failed to get container status \"96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc\": rpc error: code = NotFound desc = could not find container \"96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc\": container with ID starting with 96ec35d3ee4ae9f4ff3b9ca6df13fc3881d90e04dfd3d0f56bc8500d1341f8fc not found: ID does not exist" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.521686 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-38dc-account-create-update-jstb7" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.524158 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.541648 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.557979 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:04 crc kubenswrapper[4713]: E0314 05:55:04.558484 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7058c887-6736-417a-83c0-18e8e9ca53f3" containerName="mariadb-account-create-update" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.558500 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7058c887-6736-417a-83c0-18e8e9ca53f3" containerName="mariadb-account-create-update" Mar 14 05:55:04 crc kubenswrapper[4713]: E0314 05:55:04.558520 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-log" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.558527 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-log" Mar 14 05:55:04 crc kubenswrapper[4713]: E0314 05:55:04.558545 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66beacbe-5dac-4e64-a18a-092775f976ca" containerName="mariadb-database-create" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.558550 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="66beacbe-5dac-4e64-a18a-092775f976ca" containerName="mariadb-database-create" Mar 14 05:55:04 crc kubenswrapper[4713]: E0314 05:55:04.558563 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-api" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 
05:55:04.558569 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-api" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.558819 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-log" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.558835 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7058c887-6736-417a-83c0-18e8e9ca53f3" containerName="mariadb-account-create-update" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.558853 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="66beacbe-5dac-4e64-a18a-092775f976ca" containerName="mariadb-database-create" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.558861 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" containerName="nova-api-api" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.560157 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.564808 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.579119 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ddsc\" (UniqueName: \"kubernetes.io/projected/7058c887-6736-417a-83c0-18e8e9ca53f3-kube-api-access-4ddsc\") pod \"7058c887-6736-417a-83c0-18e8e9ca53f3\" (UID: \"7058c887-6736-417a-83c0-18e8e9ca53f3\") " Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.579166 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7058c887-6736-417a-83c0-18e8e9ca53f3-operator-scripts\") pod \"7058c887-6736-417a-83c0-18e8e9ca53f3\" (UID: \"7058c887-6736-417a-83c0-18e8e9ca53f3\") " Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.579986 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af082c59-dad5-4273-b6f1-7f9bce22249a-logs\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.580107 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7058c887-6736-417a-83c0-18e8e9ca53f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7058c887-6736-417a-83c0-18e8e9ca53f3" (UID: "7058c887-6736-417a-83c0-18e8e9ca53f3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.580798 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-config-data\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.580930 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.581091 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fsqs\" (UniqueName: \"kubernetes.io/projected/af082c59-dad5-4273-b6f1-7f9bce22249a-kube-api-access-9fsqs\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.581265 4713 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7058c887-6736-417a-83c0-18e8e9ca53f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.585605 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7058c887-6736-417a-83c0-18e8e9ca53f3-kube-api-access-4ddsc" (OuterVolumeSpecName: "kube-api-access-4ddsc") pod "7058c887-6736-417a-83c0-18e8e9ca53f3" (UID: "7058c887-6736-417a-83c0-18e8e9ca53f3"). InnerVolumeSpecName "kube-api-access-4ddsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.588467 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.686277 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-config-data\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.686667 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.686821 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fsqs\" (UniqueName: \"kubernetes.io/projected/af082c59-dad5-4273-b6f1-7f9bce22249a-kube-api-access-9fsqs\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.687712 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af082c59-dad5-4273-b6f1-7f9bce22249a-logs\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.687867 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ddsc\" (UniqueName: \"kubernetes.io/projected/7058c887-6736-417a-83c0-18e8e9ca53f3-kube-api-access-4ddsc\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.688394 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af082c59-dad5-4273-b6f1-7f9bce22249a-logs\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.690907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-config-data\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.700046 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.708942 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fsqs\" (UniqueName: \"kubernetes.io/projected/af082c59-dad5-4273-b6f1-7f9bce22249a-kube-api-access-9fsqs\") pod \"nova-api-0\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " pod="openstack/nova-api-0" Mar 14 05:55:04 crc kubenswrapper[4713]: E0314 05:55:04.730025 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:55:04 crc kubenswrapper[4713]: I0314 05:55:04.882891 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.075617 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-38dc-account-create-update-jstb7" event={"ID":"7058c887-6736-417a-83c0-18e8e9ca53f3","Type":"ContainerDied","Data":"6291c2530762c461841e24bd60b23c1dfcdeefd90991171c66cb012080a2f6ee"} Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.075852 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6291c2530762c461841e24bd60b23c1dfcdeefd90991171c66cb012080a2f6ee" Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.075954 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-38dc-account-create-update-jstb7" Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.088198 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63f5eb7b-a993-4198-9ed7-4b7522223fb7","Type":"ContainerStarted","Data":"c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85"} Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.088254 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63f5eb7b-a993-4198-9ed7-4b7522223fb7","Type":"ContainerStarted","Data":"8592a757bdd2c58cc8fb38b13efa42561cfd7bf25e1c61206c5893f2f1f28052"} Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.119734 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.119701044 podStartE2EDuration="2.119701044s" podCreationTimestamp="2026-03-14 05:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:05.103677813 +0000 UTC m=+1688.191587113" watchObservedRunningTime="2026-03-14 05:55:05.119701044 +0000 UTC m=+1688.207610344" Mar 14 05:55:05 crc kubenswrapper[4713]: 
W0314 05:55:05.374746 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf082c59_dad5_4273_b6f1_7f9bce22249a.slice/crio-bb454dfbaecb8563d942ab0a701f52a05794f5514d5984fb3e7d575e999b93b4 WatchSource:0}: Error finding container bb454dfbaecb8563d942ab0a701f52a05794f5514d5984fb3e7d575e999b93b4: Status 404 returned error can't find the container with id bb454dfbaecb8563d942ab0a701f52a05794f5514d5984fb3e7d575e999b93b4
Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.376830 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.577805 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f67d9142-27fa-4792-aa82-243d1d036c7f" path="/var/lib/kubelet/pods/f67d9142-27fa-4792-aa82-243d1d036c7f/volumes"
Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.939769 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-qbgdm"]
Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.941704 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.943964 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.944129 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.943964 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.945966 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2n7nm"
Mar 14 05:55:05 crc kubenswrapper[4713]: I0314 05:55:05.955188 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-qbgdm"]
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.026313 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxfkv\" (UniqueName: \"kubernetes.io/projected/29070117-9a34-485a-bd43-2d2ea1d65e00-kube-api-access-rxfkv\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.026503 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-config-data\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.026749 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-combined-ca-bundle\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.026879 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-scripts\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.100301 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af082c59-dad5-4273-b6f1-7f9bce22249a","Type":"ContainerStarted","Data":"16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e"}
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.100355 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af082c59-dad5-4273-b6f1-7f9bce22249a","Type":"ContainerStarted","Data":"bb454dfbaecb8563d942ab0a701f52a05794f5514d5984fb3e7d575e999b93b4"}
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.129441 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxfkv\" (UniqueName: \"kubernetes.io/projected/29070117-9a34-485a-bd43-2d2ea1d65e00-kube-api-access-rxfkv\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.129556 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-config-data\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.129638 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-combined-ca-bundle\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.129726 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-scripts\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.134771 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-combined-ca-bundle\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.137780 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-scripts\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.140148 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-config-data\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.152072 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxfkv\" (UniqueName: \"kubernetes.io/projected/29070117-9a34-485a-bd43-2d2ea1d65e00-kube-api-access-rxfkv\") pod \"aodh-db-sync-qbgdm\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") " pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.269591 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:06 crc kubenswrapper[4713]: W0314 05:55:06.793505 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29070117_9a34_485a_bd43_2d2ea1d65e00.slice/crio-38e5d71154475c97e817feaff57eda5558993fabbad0800be47d7f022494dc48 WatchSource:0}: Error finding container 38e5d71154475c97e817feaff57eda5558993fabbad0800be47d7f022494dc48: Status 404 returned error can't find the container with id 38e5d71154475c97e817feaff57eda5558993fabbad0800be47d7f022494dc48
Mar 14 05:55:06 crc kubenswrapper[4713]: I0314 05:55:06.794797 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-qbgdm"]
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.114229 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af082c59-dad5-4273-b6f1-7f9bce22249a","Type":"ContainerStarted","Data":"27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1"}
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.118106 4713 generic.go:334] "Generic (PLEG): container finished" podID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerID="20011c95aeccd5bb5f40ebe9c34709dc507e19f8af4c4da321b7588f818b892f" exitCode=0
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.118244 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerDied","Data":"20011c95aeccd5bb5f40ebe9c34709dc507e19f8af4c4da321b7588f818b892f"}
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.118288 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8f9166f-6772-4ca5-8d81-a4c2358afc44","Type":"ContainerDied","Data":"0f3531c376316897c55e57c2d74777c7b8cecaa42a079e1a8c89bf10a74f5a03"}
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.118305 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f3531c376316897c55e57c2d74777c7b8cecaa42a079e1a8c89bf10a74f5a03"
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.119328 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qbgdm" event={"ID":"29070117-9a34-485a-bd43-2d2ea1d65e00","Type":"ContainerStarted","Data":"38e5d71154475c97e817feaff57eda5558993fabbad0800be47d7f022494dc48"}
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.135343 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.159673 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-log-httpd\") pod \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") "
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.160146 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b8f9166f-6772-4ca5-8d81-a4c2358afc44" (UID: "b8f9166f-6772-4ca5-8d81-a4c2358afc44"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.160584 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-scripts\") pod \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") "
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.160777 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbhg5\" (UniqueName: \"kubernetes.io/projected/b8f9166f-6772-4ca5-8d81-a4c2358afc44-kube-api-access-xbhg5\") pod \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") "
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.160974 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-combined-ca-bundle\") pod \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") "
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.162616 4713 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.169579 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f9166f-6772-4ca5-8d81-a4c2358afc44-kube-api-access-xbhg5" (OuterVolumeSpecName: "kube-api-access-xbhg5") pod "b8f9166f-6772-4ca5-8d81-a4c2358afc44" (UID: "b8f9166f-6772-4ca5-8d81-a4c2358afc44"). InnerVolumeSpecName "kube-api-access-xbhg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.172077 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.17205128 podStartE2EDuration="3.17205128s" podCreationTimestamp="2026-03-14 05:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:07.165820088 +0000 UTC m=+1690.253729398" watchObservedRunningTime="2026-03-14 05:55:07.17205128 +0000 UTC m=+1690.259960580"
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.182437 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-scripts" (OuterVolumeSpecName: "scripts") pod "b8f9166f-6772-4ca5-8d81-a4c2358afc44" (UID: "b8f9166f-6772-4ca5-8d81-a4c2358afc44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.265411 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-config-data\") pod \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") "
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.265480 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-sg-core-conf-yaml\") pod \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") "
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.265551 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-run-httpd\") pod \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\" (UID: \"b8f9166f-6772-4ca5-8d81-a4c2358afc44\") "
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.266142 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.266164 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbhg5\" (UniqueName: \"kubernetes.io/projected/b8f9166f-6772-4ca5-8d81-a4c2358afc44-kube-api-access-xbhg5\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.286395 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b8f9166f-6772-4ca5-8d81-a4c2358afc44" (UID: "b8f9166f-6772-4ca5-8d81-a4c2358afc44"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.338366 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b8f9166f-6772-4ca5-8d81-a4c2358afc44" (UID: "b8f9166f-6772-4ca5-8d81-a4c2358afc44"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.370708 4713 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.370748 4713 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8f9166f-6772-4ca5-8d81-a4c2358afc44-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.401505 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8f9166f-6772-4ca5-8d81-a4c2358afc44" (UID: "b8f9166f-6772-4ca5-8d81-a4c2358afc44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.451413 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-config-data" (OuterVolumeSpecName: "config-data") pod "b8f9166f-6772-4ca5-8d81-a4c2358afc44" (UID: "b8f9166f-6772-4ca5-8d81-a4c2358afc44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.473074 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:07 crc kubenswrapper[4713]: I0314 05:55:07.473308 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f9166f-6772-4ca5-8d81-a4c2358afc44-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.129654 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.167313 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.183882 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.198088 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 14 05:55:08 crc kubenswrapper[4713]: E0314 05:55:08.198651 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="ceilometer-notification-agent"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.198672 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="ceilometer-notification-agent"
Mar 14 05:55:08 crc kubenswrapper[4713]: E0314 05:55:08.198700 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="proxy-httpd"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.198706 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="proxy-httpd"
Mar 14 05:55:08 crc kubenswrapper[4713]: E0314 05:55:08.198722 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="ceilometer-central-agent"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.198728 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="ceilometer-central-agent"
Mar 14 05:55:08 crc kubenswrapper[4713]: E0314 05:55:08.198736 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="sg-core"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.198742 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="sg-core"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.198964 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="proxy-httpd"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.198982 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="sg-core"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.199000 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="ceilometer-notification-agent"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.199014 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" containerName="ceilometer-central-agent"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.201269 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.204274 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.204568 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.233131 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.298275 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-run-httpd\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.298391 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5lkz\" (UniqueName: \"kubernetes.io/projected/e31f5798-c01e-427d-9db5-e955ae1b383b-kube-api-access-s5lkz\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.298595 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.298794 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-config-data\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.298879 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-scripts\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.298961 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-log-httpd\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.299014 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.401333 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.401439 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-run-httpd\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.401558 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5lkz\" (UniqueName: \"kubernetes.io/projected/e31f5798-c01e-427d-9db5-e955ae1b383b-kube-api-access-s5lkz\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.401882 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-run-httpd\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.402106 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.402273 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-config-data\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.402344 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-scripts\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.402409 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-log-httpd\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.402935 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-log-httpd\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.407433 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.408398 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-config-data\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.410261 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-scripts\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.412419 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.421446 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5lkz\" (UniqueName: \"kubernetes.io/projected/e31f5798-c01e-427d-9db5-e955ae1b383b-kube-api-access-s5lkz\") pod \"ceilometer-0\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " pod="openstack/ceilometer-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.516927 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 14 05:55:08 crc kubenswrapper[4713]: I0314 05:55:08.532620 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 05:55:09 crc kubenswrapper[4713]: I0314 05:55:09.074154 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 05:55:09 crc kubenswrapper[4713]: I0314 05:55:09.579061 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f9166f-6772-4ca5-8d81-a4c2358afc44" path="/var/lib/kubelet/pods/b8f9166f-6772-4ca5-8d81-a4c2358afc44/volumes"
Mar 14 05:55:10 crc kubenswrapper[4713]: I0314 05:55:10.731891 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 05:55:10 crc kubenswrapper[4713]: I0314 05:55:10.732157 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 05:55:10 crc kubenswrapper[4713]: W0314 05:55:10.996450 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode31f5798_c01e_427d_9db5_e955ae1b383b.slice/crio-b7f7c2582fb26202b28aa8df9c4f37ef3dcbdff5174e362ef326571024542cc5 WatchSource:0}: Error finding container b7f7c2582fb26202b28aa8df9c4f37ef3dcbdff5174e362ef326571024542cc5: Status 404 returned error can't find the container with id b7f7c2582fb26202b28aa8df9c4f37ef3dcbdff5174e362ef326571024542cc5
Mar 14 05:55:11 crc kubenswrapper[4713]: I0314 05:55:11.168145 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerStarted","Data":"b7f7c2582fb26202b28aa8df9c4f37ef3dcbdff5174e362ef326571024542cc5"}
Mar 14 05:55:12 crc kubenswrapper[4713]: I0314 05:55:12.183276 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qbgdm" event={"ID":"29070117-9a34-485a-bd43-2d2ea1d65e00","Type":"ContainerStarted","Data":"b789bc5a97672113bc496f3ba7a6133874097474ec2ff68ab4098765c81dd3d7"}
Mar 14 05:55:12 crc kubenswrapper[4713]: I0314 05:55:12.188293 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerStarted","Data":"ff3d108eb1989359ed3887ac833c162ed2e00584476e8cb0b15006618fd1b606"}
Mar 14 05:55:12 crc kubenswrapper[4713]: I0314 05:55:12.207118 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-qbgdm" podStartSLOduration=2.9555014010000003 podStartE2EDuration="7.207096736s" podCreationTimestamp="2026-03-14 05:55:05 +0000 UTC" firstStartedPulling="2026-03-14 05:55:06.798055603 +0000 UTC m=+1689.885964903" lastFinishedPulling="2026-03-14 05:55:11.049650938 +0000 UTC m=+1694.137560238" observedRunningTime="2026-03-14 05:55:12.198079819 +0000 UTC m=+1695.285989119" watchObservedRunningTime="2026-03-14 05:55:12.207096736 +0000 UTC m=+1695.295006036"
Mar 14 05:55:12 crc kubenswrapper[4713]: I0314 05:55:12.419964 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 14 05:55:13 crc kubenswrapper[4713]: I0314 05:55:13.203120 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerStarted","Data":"cda99b21bb4951367e8ce56606305f6c6194d1e1461985bddd8e84543cde7874"}
Mar 14 05:55:13 crc kubenswrapper[4713]: I0314 05:55:13.203474 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerStarted","Data":"65425c6b60d661a087040d1e71fa8ed9e5a9378ab2832007f252858f6863405c"}
Mar 14 05:55:13 crc kubenswrapper[4713]: I0314 05:55:13.517259 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 14 05:55:13 crc kubenswrapper[4713]: I0314 05:55:13.558621 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 14 05:55:14 crc kubenswrapper[4713]: I0314 05:55:14.216790 4713 generic.go:334] "Generic (PLEG): container finished" podID="29070117-9a34-485a-bd43-2d2ea1d65e00" containerID="b789bc5a97672113bc496f3ba7a6133874097474ec2ff68ab4098765c81dd3d7" exitCode=0
Mar 14 05:55:14 crc kubenswrapper[4713]: I0314 05:55:14.216875 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qbgdm" event={"ID":"29070117-9a34-485a-bd43-2d2ea1d65e00","Type":"ContainerDied","Data":"b789bc5a97672113bc496f3ba7a6133874097474ec2ff68ab4098765c81dd3d7"}
Mar 14 05:55:14 crc kubenswrapper[4713]: I0314 05:55:14.249682 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 14 05:55:14 crc kubenswrapper[4713]: I0314 05:55:14.883651 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 14 05:55:14 crc kubenswrapper[4713]: I0314 05:55:14.884310 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 14 05:55:15 crc kubenswrapper[4713]: E0314 05:55:15.073133 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]"
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.231701 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerStarted","Data":"ae62fdb00c3bc71d4e7614523fc39c4de372fedda5804c520382257ea1525335"}
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.233509 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.266749 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.685918177 podStartE2EDuration="7.266721835s" podCreationTimestamp="2026-03-14 05:55:08 +0000 UTC" firstStartedPulling="2026-03-14 05:55:10.99915244 +0000 UTC m=+1694.087061740" lastFinishedPulling="2026-03-14 05:55:14.579956078 +0000 UTC m=+1697.667865398" observedRunningTime="2026-03-14 05:55:15.259109341 +0000 UTC m=+1698.347018641" watchObservedRunningTime="2026-03-14 05:55:15.266721835 +0000 UTC m=+1698.354631135"
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.664098 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-qbgdm"
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.807655 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-combined-ca-bundle\") pod \"29070117-9a34-485a-bd43-2d2ea1d65e00\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") "
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.808323 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-config-data\") pod \"29070117-9a34-485a-bd43-2d2ea1d65e00\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") "
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.808530 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxfkv\" (UniqueName: \"kubernetes.io/projected/29070117-9a34-485a-bd43-2d2ea1d65e00-kube-api-access-rxfkv\") pod \"29070117-9a34-485a-bd43-2d2ea1d65e00\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") "
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.808627 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-scripts\") pod \"29070117-9a34-485a-bd43-2d2ea1d65e00\" (UID: \"29070117-9a34-485a-bd43-2d2ea1d65e00\") "
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.814051 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29070117-9a34-485a-bd43-2d2ea1d65e00-kube-api-access-rxfkv" (OuterVolumeSpecName: "kube-api-access-rxfkv") pod "29070117-9a34-485a-bd43-2d2ea1d65e00" (UID: "29070117-9a34-485a-bd43-2d2ea1d65e00"). InnerVolumeSpecName "kube-api-access-rxfkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.814142 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-scripts" (OuterVolumeSpecName: "scripts") pod "29070117-9a34-485a-bd43-2d2ea1d65e00" (UID: "29070117-9a34-485a-bd43-2d2ea1d65e00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.840643 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-config-data" (OuterVolumeSpecName: "config-data") pod "29070117-9a34-485a-bd43-2d2ea1d65e00" (UID: "29070117-9a34-485a-bd43-2d2ea1d65e00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.845904 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29070117-9a34-485a-bd43-2d2ea1d65e00" (UID: "29070117-9a34-485a-bd43-2d2ea1d65e00"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.911895 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.911938 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.911951 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxfkv\" (UniqueName: \"kubernetes.io/projected/29070117-9a34-485a-bd43-2d2ea1d65e00-kube-api-access-rxfkv\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.911965 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29070117-9a34-485a-bd43-2d2ea1d65e00-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.966901 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 05:55:15 crc kubenswrapper[4713]: I0314 05:55:15.966477 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 05:55:16 crc kubenswrapper[4713]: I0314 05:55:16.245166 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-qbgdm" Mar 14 05:55:16 crc kubenswrapper[4713]: I0314 05:55:16.250645 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qbgdm" event={"ID":"29070117-9a34-485a-bd43-2d2ea1d65e00","Type":"ContainerDied","Data":"38e5d71154475c97e817feaff57eda5558993fabbad0800be47d7f022494dc48"} Mar 14 05:55:16 crc kubenswrapper[4713]: I0314 05:55:16.250729 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38e5d71154475c97e817feaff57eda5558993fabbad0800be47d7f022494dc48" Mar 14 05:55:17 crc kubenswrapper[4713]: E0314 05:55:17.605026 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.503316 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 14 05:55:20 crc kubenswrapper[4713]: E0314 05:55:20.505084 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29070117-9a34-485a-bd43-2d2ea1d65e00" containerName="aodh-db-sync" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.505106 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="29070117-9a34-485a-bd43-2d2ea1d65e00" containerName="aodh-db-sync" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.505469 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="29070117-9a34-485a-bd43-2d2ea1d65e00" containerName="aodh-db-sync" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.508303 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.511712 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2n7nm" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.512150 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.519822 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.528386 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.633767 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8wnf\" (UniqueName: \"kubernetes.io/projected/f42f8799-d0e6-47c8-a847-53ea94d892b6-kube-api-access-d8wnf\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.633878 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-scripts\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.633938 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-config-data\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.633972 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.736707 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8wnf\" (UniqueName: \"kubernetes.io/projected/f42f8799-d0e6-47c8-a847-53ea94d892b6-kube-api-access-d8wnf\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.736846 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-scripts\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.736920 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-config-data\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.736965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.745414 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-scripts\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.747386 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-config-data\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.758864 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.763799 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8wnf\" (UniqueName: \"kubernetes.io/projected/f42f8799-d0e6-47c8-a847-53ea94d892b6-kube-api-access-d8wnf\") pod \"aodh-0\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") " pod="openstack/aodh-0" Mar 14 05:55:20 crc kubenswrapper[4713]: I0314 05:55:20.847365 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 14 05:55:21 crc kubenswrapper[4713]: I0314 05:55:21.438851 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 14 05:55:22 crc kubenswrapper[4713]: I0314 05:55:22.310538 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerStarted","Data":"705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6"} Mar 14 05:55:22 crc kubenswrapper[4713]: I0314 05:55:22.310878 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerStarted","Data":"da33f2b3234f85003641b9627b52457ef2bd5465293cc0ea5ae142244892f29e"} Mar 14 05:55:22 crc kubenswrapper[4713]: I0314 05:55:22.883163 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 05:55:22 crc kubenswrapper[4713]: I0314 05:55:22.886290 
4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 05:55:23 crc kubenswrapper[4713]: I0314 05:55:23.007432 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:23 crc kubenswrapper[4713]: I0314 05:55:23.007814 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="ceilometer-central-agent" containerID="cri-o://ff3d108eb1989359ed3887ac833c162ed2e00584476e8cb0b15006618fd1b606" gracePeriod=30 Mar 14 05:55:23 crc kubenswrapper[4713]: I0314 05:55:23.008367 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="proxy-httpd" containerID="cri-o://ae62fdb00c3bc71d4e7614523fc39c4de372fedda5804c520382257ea1525335" gracePeriod=30 Mar 14 05:55:23 crc kubenswrapper[4713]: I0314 05:55:23.008430 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="ceilometer-notification-agent" containerID="cri-o://65425c6b60d661a087040d1e71fa8ed9e5a9378ab2832007f252858f6863405c" gracePeriod=30 Mar 14 05:55:23 crc kubenswrapper[4713]: I0314 05:55:23.008583 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="sg-core" containerID="cri-o://cda99b21bb4951367e8ce56606305f6c6194d1e1461985bddd8e84543cde7874" gracePeriod=30 Mar 14 05:55:23 crc kubenswrapper[4713]: I0314 05:55:23.333701 4713 generic.go:334] "Generic (PLEG): container finished" podID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerID="ae62fdb00c3bc71d4e7614523fc39c4de372fedda5804c520382257ea1525335" exitCode=0 Mar 14 05:55:23 crc kubenswrapper[4713]: I0314 05:55:23.336019 4713 generic.go:334] "Generic (PLEG): 
container finished" podID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerID="cda99b21bb4951367e8ce56606305f6c6194d1e1461985bddd8e84543cde7874" exitCode=2 Mar 14 05:55:23 crc kubenswrapper[4713]: I0314 05:55:23.333781 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerDied","Data":"ae62fdb00c3bc71d4e7614523fc39c4de372fedda5804c520382257ea1525335"} Mar 14 05:55:23 crc kubenswrapper[4713]: I0314 05:55:23.336413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerDied","Data":"cda99b21bb4951367e8ce56606305f6c6194d1e1461985bddd8e84543cde7874"} Mar 14 05:55:24 crc kubenswrapper[4713]: I0314 05:55:24.498652 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 14 05:55:24 crc kubenswrapper[4713]: I0314 05:55:24.513432 4713 generic.go:334] "Generic (PLEG): container finished" podID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerID="65425c6b60d661a087040d1e71fa8ed9e5a9378ab2832007f252858f6863405c" exitCode=0 Mar 14 05:55:24 crc kubenswrapper[4713]: I0314 05:55:24.513468 4713 generic.go:334] "Generic (PLEG): container finished" podID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerID="ff3d108eb1989359ed3887ac833c162ed2e00584476e8cb0b15006618fd1b606" exitCode=0 Mar 14 05:55:24 crc kubenswrapper[4713]: I0314 05:55:24.513492 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerDied","Data":"65425c6b60d661a087040d1e71fa8ed9e5a9378ab2832007f252858f6863405c"} Mar 14 05:55:24 crc kubenswrapper[4713]: I0314 05:55:24.513525 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerDied","Data":"ff3d108eb1989359ed3887ac833c162ed2e00584476e8cb0b15006618fd1b606"} Mar 14 
05:55:24 crc kubenswrapper[4713]: I0314 05:55:24.887138 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 05:55:24 crc kubenswrapper[4713]: I0314 05:55:24.888647 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 05:55:24 crc kubenswrapper[4713]: I0314 05:55:24.895118 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.200851 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.239164 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-log-httpd\") pod \"e31f5798-c01e-427d-9db5-e955ae1b383b\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.239321 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-config-data\") pod \"e31f5798-c01e-427d-9db5-e955ae1b383b\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.239391 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-run-httpd\") pod \"e31f5798-c01e-427d-9db5-e955ae1b383b\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.239535 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5lkz\" (UniqueName: \"kubernetes.io/projected/e31f5798-c01e-427d-9db5-e955ae1b383b-kube-api-access-s5lkz\") pod 
\"e31f5798-c01e-427d-9db5-e955ae1b383b\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.239602 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-scripts\") pod \"e31f5798-c01e-427d-9db5-e955ae1b383b\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.239628 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-combined-ca-bundle\") pod \"e31f5798-c01e-427d-9db5-e955ae1b383b\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.239671 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-sg-core-conf-yaml\") pod \"e31f5798-c01e-427d-9db5-e955ae1b383b\" (UID: \"e31f5798-c01e-427d-9db5-e955ae1b383b\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.240841 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e31f5798-c01e-427d-9db5-e955ae1b383b" (UID: "e31f5798-c01e-427d-9db5-e955ae1b383b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.241259 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e31f5798-c01e-427d-9db5-e955ae1b383b" (UID: "e31f5798-c01e-427d-9db5-e955ae1b383b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.242665 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.252421 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-scripts" (OuterVolumeSpecName: "scripts") pod "e31f5798-c01e-427d-9db5-e955ae1b383b" (UID: "e31f5798-c01e-427d-9db5-e955ae1b383b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.259616 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e31f5798-c01e-427d-9db5-e955ae1b383b-kube-api-access-s5lkz" (OuterVolumeSpecName: "kube-api-access-s5lkz") pod "e31f5798-c01e-427d-9db5-e955ae1b383b" (UID: "e31f5798-c01e-427d-9db5-e955ae1b383b"). InnerVolumeSpecName "kube-api-access-s5lkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.318703 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.319342 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e31f5798-c01e-427d-9db5-e955ae1b383b" (UID: "e31f5798-c01e-427d-9db5-e955ae1b383b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.356803 4713 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.356917 4713 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.356931 4713 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e31f5798-c01e-427d-9db5-e955ae1b383b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.356945 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5lkz\" (UniqueName: \"kubernetes.io/projected/e31f5798-c01e-427d-9db5-e955ae1b383b-kube-api-access-s5lkz\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.356959 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.467987 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzwqf\" (UniqueName: \"kubernetes.io/projected/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-kube-api-access-mzwqf\") pod \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.468137 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-config-data\") pod 
\"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.468329 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxmnp\" (UniqueName: \"kubernetes.io/projected/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-kube-api-access-bxmnp\") pod \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.468373 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-combined-ca-bundle\") pod \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.468426 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-config-data\") pod \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.468566 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-logs\") pod \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.468652 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-combined-ca-bundle\") pod \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\" (UID: \"6f582f5f-e4b3-4a66-b366-4abfbbe100f1\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.478686 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-logs" (OuterVolumeSpecName: "logs") pod "6f582f5f-e4b3-4a66-b366-4abfbbe100f1" (UID: "6f582f5f-e4b3-4a66-b366-4abfbbe100f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.495490 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-kube-api-access-bxmnp" (OuterVolumeSpecName: "kube-api-access-bxmnp") pod "6f582f5f-e4b3-4a66-b366-4abfbbe100f1" (UID: "6f582f5f-e4b3-4a66-b366-4abfbbe100f1"). InnerVolumeSpecName "kube-api-access-bxmnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.513491 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-kube-api-access-mzwqf" (OuterVolumeSpecName: "kube-api-access-mzwqf") pod "a5b498d2-114c-44f2-b2d2-681d4bedbdbb" (UID: "a5b498d2-114c-44f2-b2d2-681d4bedbdbb"). InnerVolumeSpecName "kube-api-access-mzwqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.571922 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5b498d2-114c-44f2-b2d2-681d4bedbdbb" (UID: "a5b498d2-114c-44f2-b2d2-681d4bedbdbb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.572320 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxmnp\" (UniqueName: \"kubernetes.io/projected/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-kube-api-access-bxmnp\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.572371 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-logs\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.572535 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzwqf\" (UniqueName: \"kubernetes.io/projected/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-kube-api-access-mzwqf\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.581598 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f582f5f-e4b3-4a66-b366-4abfbbe100f1" (UID: "6f582f5f-e4b3-4a66-b366-4abfbbe100f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.617503 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e31f5798-c01e-427d-9db5-e955ae1b383b" (UID: "e31f5798-c01e-427d-9db5-e955ae1b383b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.651805 4713 generic.go:334] "Generic (PLEG): container finished" podID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerID="b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337" exitCode=137 Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.652482 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.664707 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-config-data" (OuterVolumeSpecName: "config-data") pod "6f582f5f-e4b3-4a66-b366-4abfbbe100f1" (UID: "6f582f5f-e4b3-4a66-b366-4abfbbe100f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.673762 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-config-data" (OuterVolumeSpecName: "config-data") pod "a5b498d2-114c-44f2-b2d2-681d4bedbdbb" (UID: "a5b498d2-114c-44f2-b2d2-681d4bedbdbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.674061 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-config-data\") pod \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\" (UID: \"a5b498d2-114c-44f2-b2d2-681d4bedbdbb\") " Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.674788 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.674810 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.674820 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.674831 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f582f5f-e4b3-4a66-b366-4abfbbe100f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: W0314 05:55:25.675405 4713 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a5b498d2-114c-44f2-b2d2-681d4bedbdbb/volumes/kubernetes.io~secret/config-data Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.675435 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-config-data" (OuterVolumeSpecName: "config-data") pod 
"a5b498d2-114c-44f2-b2d2-681d4bedbdbb" (UID: "a5b498d2-114c-44f2-b2d2-681d4bedbdbb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.675611 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.687470 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-config-data" (OuterVolumeSpecName: "config-data") pod "e31f5798-c01e-427d-9db5-e955ae1b383b" (UID: "e31f5798-c01e-427d-9db5-e955ae1b383b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.704305 4713 generic.go:334] "Generic (PLEG): container finished" podID="a5b498d2-114c-44f2-b2d2-681d4bedbdbb" containerID="ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c" exitCode=137 Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.706985 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.778361 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31f5798-c01e-427d-9db5-e955ae1b383b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.778402 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b498d2-114c-44f2-b2d2-681d4bedbdbb-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:25 crc kubenswrapper[4713]: E0314 05:55:25.784395 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.834551 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.834596 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerStarted","Data":"8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf"} Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.834669 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f582f5f-e4b3-4a66-b366-4abfbbe100f1","Type":"ContainerDied","Data":"b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337"} Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.834686 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f582f5f-e4b3-4a66-b366-4abfbbe100f1","Type":"ContainerDied","Data":"7510b5e483f88cf69e6cd298e9f63e2524cf33fbb50abf1224363847aca55b7e"} Mar 14 05:55:25 crc 
kubenswrapper[4713]: I0314 05:55:25.834705 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e31f5798-c01e-427d-9db5-e955ae1b383b","Type":"ContainerDied","Data":"b7f7c2582fb26202b28aa8df9c4f37ef3dcbdff5174e362ef326571024542cc5"} Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.834724 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5b498d2-114c-44f2-b2d2-681d4bedbdbb","Type":"ContainerDied","Data":"ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c"} Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.834741 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5b498d2-114c-44f2-b2d2-681d4bedbdbb","Type":"ContainerDied","Data":"05bb35f238ecf88a35fb71efaf088916a76fc7253983258103fd40e23c03c1fd"} Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.834763 4713 scope.go:117] "RemoveContainer" containerID="b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.865091 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.882603 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.909662 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 05:55:25 crc kubenswrapper[4713]: E0314 05:55:25.910287 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="ceilometer-notification-agent" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910310 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="ceilometer-notification-agent" Mar 14 05:55:25 crc 
kubenswrapper[4713]: E0314 05:55:25.910341 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerName="nova-metadata-log" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910349 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerName="nova-metadata-log" Mar 14 05:55:25 crc kubenswrapper[4713]: E0314 05:55:25.910369 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b498d2-114c-44f2-b2d2-681d4bedbdbb" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910375 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b498d2-114c-44f2-b2d2-681d4bedbdbb" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 05:55:25 crc kubenswrapper[4713]: E0314 05:55:25.910381 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="proxy-httpd" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910387 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="proxy-httpd" Mar 14 05:55:25 crc kubenswrapper[4713]: E0314 05:55:25.910406 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="sg-core" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910412 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="sg-core" Mar 14 05:55:25 crc kubenswrapper[4713]: E0314 05:55:25.910435 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerName="nova-metadata-metadata" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910440 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerName="nova-metadata-metadata" Mar 14 
05:55:25 crc kubenswrapper[4713]: E0314 05:55:25.910450 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="ceilometer-central-agent" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910456 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="ceilometer-central-agent" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910719 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="sg-core" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910733 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b498d2-114c-44f2-b2d2-681d4bedbdbb" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910753 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerName="nova-metadata-metadata" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910762 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="ceilometer-notification-agent" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910776 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="proxy-httpd" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910787 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" containerName="nova-metadata-log" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.910798 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" containerName="ceilometer-central-agent" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.912056 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.916376 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.916623 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.918473 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.966388 4713 scope.go:117] "RemoveContainer" containerID="d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d" Mar 14 05:55:25 crc kubenswrapper[4713]: I0314 05:55:25.984665 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.087428 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.087503 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.087547 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fdg5\" (UniqueName: 
\"kubernetes.io/projected/bbb8a72f-747e-4a4b-942c-3487e6c2e476-kube-api-access-7fdg5\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.087598 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.087671 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.128091 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.131120 4713 scope.go:117] "RemoveContainer" containerID="b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337" Mar 14 05:55:26 crc kubenswrapper[4713]: E0314 05:55:26.133822 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337\": container with ID starting with b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337 not found: ID does not exist" containerID="b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.134006 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337"} err="failed to get container status \"b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337\": rpc error: code = NotFound desc = could not find container \"b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337\": container with ID starting with b9239305d9bdc790b9c7c621a83e8b89e06a24fb727992c4e9de1598a7556337 not found: ID does not exist" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.134109 4713 scope.go:117] "RemoveContainer" containerID="d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d" Mar 14 05:55:26 crc kubenswrapper[4713]: E0314 05:55:26.135265 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d\": container with ID starting with d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d not found: ID does not exist" containerID="d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.135312 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d"} err="failed to get container status \"d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d\": rpc error: code = NotFound desc = could not find container \"d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d\": container with ID starting with d8dc5b1d18ce26fbf27ea8d6e298e380a6906d42571339b353b3831c1f7cec1d not found: ID does not exist" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.135339 4713 scope.go:117] "RemoveContainer" containerID="ae62fdb00c3bc71d4e7614523fc39c4de372fedda5804c520382257ea1525335" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.155715 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.168520 4713 scope.go:117] "RemoveContainer" containerID="cda99b21bb4951367e8ce56606305f6c6194d1e1461985bddd8e84543cde7874" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.169336 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.181798 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.190059 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.190189 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.190291 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.190330 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fdg5\" (UniqueName: \"kubernetes.io/projected/bbb8a72f-747e-4a4b-942c-3487e6c2e476-kube-api-access-7fdg5\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.190387 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.195887 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.200492 4713 scope.go:117] "RemoveContainer" containerID="65425c6b60d661a087040d1e71fa8ed9e5a9378ab2832007f252858f6863405c" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.202706 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.203215 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.205914 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bbb8a72f-747e-4a4b-942c-3487e6c2e476-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.209253 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.211809 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.218503 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.220890 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.226837 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fdg5\" (UniqueName: \"kubernetes.io/projected/bbb8a72f-747e-4a4b-942c-3487e6c2e476-kube-api-access-7fdg5\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbb8a72f-747e-4a4b-942c-3487e6c2e476\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.230313 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.244664 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.248930 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-jsqnh"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.251631 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.272822 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.277603 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.280191 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.280429 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.294786 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh6c4\" (UniqueName: \"kubernetes.io/projected/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-kube-api-access-rh6c4\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.294845 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-config-data\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.294922 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.294943 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-scripts\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295056 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-run-httpd\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295176 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htfql\" (UniqueName: \"kubernetes.io/projected/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-kube-api-access-htfql\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295282 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-config-data\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295326 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295372 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295453 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295494 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-log-httpd\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295523 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295544 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295590 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-config\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295621 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295712 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hbwz\" (UniqueName: \"kubernetes.io/projected/ae0e401a-85dc-48a2-8696-deef7c83c0b4-kube-api-access-7hbwz\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.295778 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-logs\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.300738 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.324035 4713 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-jsqnh"] Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.404099 4713 scope.go:117] "RemoveContainer" containerID="ff3d108eb1989359ed3887ac833c162ed2e00584476e8cb0b15006618fd1b606" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.413000 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.413076 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.413140 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.413180 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-log-httpd\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.414863 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: 
\"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.415627 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.416753 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-log-httpd\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417279 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417349 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417445 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-config\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417493 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hbwz\" (UniqueName: \"kubernetes.io/projected/ae0e401a-85dc-48a2-8696-deef7c83c0b4-kube-api-access-7hbwz\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417697 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417734 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-logs\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417809 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh6c4\" (UniqueName: \"kubernetes.io/projected/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-kube-api-access-rh6c4\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417854 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-config-data\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.417956 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.418007 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-scripts\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.418061 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-run-httpd\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.418131 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htfql\" (UniqueName: \"kubernetes.io/projected/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-kube-api-access-htfql\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.418254 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-config-data\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 
05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.419041 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-config\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.419839 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.421430 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-run-httpd\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.421807 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-logs\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.422130 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.426527 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.426655 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-config-data\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.446786 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-config-data\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.449754 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htfql\" (UniqueName: \"kubernetes.io/projected/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-kube-api-access-htfql\") pod \"dnsmasq-dns-f84f9ccf-jsqnh\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.449853 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.450225 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-scripts\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 
05:55:26.451131 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.453974 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.455864 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh6c4\" (UniqueName: \"kubernetes.io/projected/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-kube-api-access-rh6c4\") pod \"nova-metadata-0\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.456374 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hbwz\" (UniqueName: \"kubernetes.io/projected/ae0e401a-85dc-48a2-8696-deef7c83c0b4-kube-api-access-7hbwz\") pod \"ceilometer-0\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") " pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.463655 4713 scope.go:117] "RemoveContainer" containerID="ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.542581 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.568538 4713 scope.go:117] "RemoveContainer" containerID="ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c" Mar 14 05:55:26 crc kubenswrapper[4713]: E0314 05:55:26.569173 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c\": container with ID starting with ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c not found: ID does not exist" containerID="ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.569227 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c"} err="failed to get container status \"ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c\": rpc error: code = NotFound desc = could not find container \"ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c\": container with ID starting with ea5feab346807a2f39ac7b664e6ddaaf8e292e69922d5321d83206d3ebde154c not found: ID does not exist" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.707021 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.746713 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:55:26 crc kubenswrapper[4713]: I0314 05:55:26.880109 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 05:55:26 crc kubenswrapper[4713]: W0314 05:55:26.918856 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbb8a72f_747e_4a4b_942c_3487e6c2e476.slice/crio-14b3a7b90ad45283c7d78240046363e7dbd3add623bcfa2d7b1b30b4c1d0a83c WatchSource:0}: Error finding container 14b3a7b90ad45283c7d78240046363e7dbd3add623bcfa2d7b1b30b4c1d0a83c: Status 404 returned error can't find the container with id 14b3a7b90ad45283c7d78240046363e7dbd3add623bcfa2d7b1b30b4c1d0a83c Mar 14 05:55:27 crc kubenswrapper[4713]: I0314 05:55:27.299577 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-jsqnh"] Mar 14 05:55:27 crc kubenswrapper[4713]: I0314 05:55:27.461697 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:27 crc kubenswrapper[4713]: I0314 05:55:27.623842 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f582f5f-e4b3-4a66-b366-4abfbbe100f1" path="/var/lib/kubelet/pods/6f582f5f-e4b3-4a66-b366-4abfbbe100f1/volumes" Mar 14 05:55:27 crc kubenswrapper[4713]: I0314 05:55:27.627799 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b498d2-114c-44f2-b2d2-681d4bedbdbb" path="/var/lib/kubelet/pods/a5b498d2-114c-44f2-b2d2-681d4bedbdbb/volumes" Mar 14 05:55:27 crc kubenswrapper[4713]: I0314 05:55:27.628922 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e31f5798-c01e-427d-9db5-e955ae1b383b" path="/var/lib/kubelet/pods/e31f5798-c01e-427d-9db5-e955ae1b383b/volumes" Mar 14 05:55:27 crc kubenswrapper[4713]: I0314 05:55:27.688267 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:27 crc 
kubenswrapper[4713]: I0314 05:55:27.783669 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bbb8a72f-747e-4a4b-942c-3487e6c2e476","Type":"ContainerStarted","Data":"492f4f06e6071f0318aec98a146e8bce76db12b1c3faa1ab84a8921e127453d2"} Mar 14 05:55:27 crc kubenswrapper[4713]: I0314 05:55:27.783935 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bbb8a72f-747e-4a4b-942c-3487e6c2e476","Type":"ContainerStarted","Data":"14b3a7b90ad45283c7d78240046363e7dbd3add623bcfa2d7b1b30b4c1d0a83c"} Mar 14 05:55:27 crc kubenswrapper[4713]: I0314 05:55:27.851549 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.851521429 podStartE2EDuration="2.851521429s" podCreationTimestamp="2026-03-14 05:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:27.809105629 +0000 UTC m=+1710.897014939" watchObservedRunningTime="2026-03-14 05:55:27.851521429 +0000 UTC m=+1710.939430729" Mar 14 05:55:27 crc kubenswrapper[4713]: I0314 05:55:27.917735 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:28 crc kubenswrapper[4713]: I0314 05:55:28.805971 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" event={"ID":"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e","Type":"ContainerDied","Data":"1bf287b51fd18f4b67b48a56e3407f95ef37a899b0fbd69c7ccfcda49c92ee81"} Mar 14 05:55:28 crc kubenswrapper[4713]: I0314 05:55:28.805750 4713 generic.go:334] "Generic (PLEG): container finished" podID="8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" containerID="1bf287b51fd18f4b67b48a56e3407f95ef37a899b0fbd69c7ccfcda49c92ee81" exitCode=0 Mar 14 05:55:28 crc kubenswrapper[4713]: I0314 05:55:28.807169 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" event={"ID":"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e","Type":"ContainerStarted","Data":"7a2c98ca80c67defd8275d4577203a6c5b125e5c64d8681d6afa62356bce254c"} Mar 14 05:55:28 crc kubenswrapper[4713]: I0314 05:55:28.813778 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe","Type":"ContainerStarted","Data":"eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3"} Mar 14 05:55:28 crc kubenswrapper[4713]: I0314 05:55:28.813825 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe","Type":"ContainerStarted","Data":"69d02b6ab0817efa0cee20757ef17dd68957386f3fe2b0ac4cceb104e9ff2d0c"} Mar 14 05:55:28 crc kubenswrapper[4713]: I0314 05:55:28.819310 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerStarted","Data":"8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566"} Mar 14 05:55:28 crc kubenswrapper[4713]: I0314 05:55:28.835959 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerStarted","Data":"3764ae1af181ba0be89bfab90046d4136615a2cb3347388420d15d69c9b07f05"} Mar 14 05:55:28 crc kubenswrapper[4713]: I0314 05:55:28.835999 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerStarted","Data":"a618b8f5d6343a287634458c0ff89d55b4d833d119ac577ea0f00cd02f453dc5"} Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.486347 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.487452 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-api" containerID="cri-o://27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1" gracePeriod=30 Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.487601 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-log" containerID="cri-o://16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e" gracePeriod=30 Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.851804 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerStarted","Data":"89dfe6b3081314697d43fece08c5ec9c83ad0c9e79cf16386926da53b558d228"} Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.855066 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" event={"ID":"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e","Type":"ContainerStarted","Data":"f10ec90eb7bd9f8c0cd0175132758dae5ee71b86eb3222ca1cdc231440cf8166"} Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.855335 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.859964 4713 generic.go:334] "Generic (PLEG): container finished" podID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerID="16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e" exitCode=143 Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.860028 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af082c59-dad5-4273-b6f1-7f9bce22249a","Type":"ContainerDied","Data":"16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e"} Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.862668 4713 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-metadata-0" event={"ID":"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe","Type":"ContainerStarted","Data":"4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511"} Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.891951 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" podStartSLOduration=3.891926179 podStartE2EDuration="3.891926179s" podCreationTimestamp="2026-03-14 05:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:29.879055383 +0000 UTC m=+1712.966964683" watchObservedRunningTime="2026-03-14 05:55:29.891926179 +0000 UTC m=+1712.979835479" Mar 14 05:55:29 crc kubenswrapper[4713]: I0314 05:55:29.911064 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.911038854 podStartE2EDuration="3.911038854s" podCreationTimestamp="2026-03-14 05:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:29.896181479 +0000 UTC m=+1712.984090779" watchObservedRunningTime="2026-03-14 05:55:29.911038854 +0000 UTC m=+1712.998948154" Mar 14 05:55:30 crc kubenswrapper[4713]: I0314 05:55:30.895265 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerStarted","Data":"8b3defe155af2ce3acd26995fa8925c1eea7a2928ce865f2a261f341c3a442d6"} Mar 14 05:55:31 crc kubenswrapper[4713]: I0314 05:55:31.245897 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 14 05:55:31 crc kubenswrapper[4713]: I0314 05:55:31.910073 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerStarted","Data":"32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f"} Mar 14 05:55:31 crc kubenswrapper[4713]: I0314 05:55:31.910294 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-api" containerID="cri-o://705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6" gracePeriod=30 Mar 14 05:55:31 crc kubenswrapper[4713]: I0314 05:55:31.910554 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-notifier" containerID="cri-o://8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566" gracePeriod=30 Mar 14 05:55:31 crc kubenswrapper[4713]: I0314 05:55:31.910351 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-listener" containerID="cri-o://32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f" gracePeriod=30 Mar 14 05:55:31 crc kubenswrapper[4713]: I0314 05:55:31.910515 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-evaluator" containerID="cri-o://8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf" gracePeriod=30 Mar 14 05:55:31 crc kubenswrapper[4713]: I0314 05:55:31.949822 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.125878318 podStartE2EDuration="11.949795193s" podCreationTimestamp="2026-03-14 05:55:20 +0000 UTC" firstStartedPulling="2026-03-14 05:55:21.425134253 +0000 UTC m=+1704.513043553" lastFinishedPulling="2026-03-14 05:55:31.249051128 +0000 UTC m=+1714.336960428" observedRunningTime="2026-03-14 05:55:31.935905587 
+0000 UTC m=+1715.023814897" watchObservedRunningTime="2026-03-14 05:55:31.949795193 +0000 UTC m=+1715.037704493" Mar 14 05:55:32 crc kubenswrapper[4713]: E0314 05:55:32.872111 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf42f8799_d0e6_47c8_a847_53ea94d892b6.slice/crio-8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566.scope\": RecentStats: unable to find data in memory cache]" Mar 14 05:55:32 crc kubenswrapper[4713]: I0314 05:55:32.925583 4713 generic.go:334] "Generic (PLEG): container finished" podID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerID="8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566" exitCode=0 Mar 14 05:55:32 crc kubenswrapper[4713]: I0314 05:55:32.925613 4713 generic.go:334] "Generic (PLEG): container finished" podID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerID="8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf" exitCode=0 Mar 14 05:55:32 crc kubenswrapper[4713]: I0314 05:55:32.925622 4713 generic.go:334] "Generic (PLEG): container finished" podID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerID="705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6" exitCode=0 Mar 14 05:55:32 crc kubenswrapper[4713]: I0314 05:55:32.925642 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerDied","Data":"8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566"} Mar 14 05:55:32 crc kubenswrapper[4713]: I0314 05:55:32.925668 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerDied","Data":"8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf"} Mar 14 05:55:32 crc kubenswrapper[4713]: I0314 05:55:32.925677 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerDied","Data":"705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6"} Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.574839 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.772092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fsqs\" (UniqueName: \"kubernetes.io/projected/af082c59-dad5-4273-b6f1-7f9bce22249a-kube-api-access-9fsqs\") pod \"af082c59-dad5-4273-b6f1-7f9bce22249a\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.772173 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af082c59-dad5-4273-b6f1-7f9bce22249a-logs\") pod \"af082c59-dad5-4273-b6f1-7f9bce22249a\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.772466 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-config-data\") pod \"af082c59-dad5-4273-b6f1-7f9bce22249a\" (UID: \"af082c59-dad5-4273-b6f1-7f9bce22249a\") " Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.772502 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-combined-ca-bundle\") pod \"af082c59-dad5-4273-b6f1-7f9bce22249a\" (UID: 
\"af082c59-dad5-4273-b6f1-7f9bce22249a\") " Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.776164 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af082c59-dad5-4273-b6f1-7f9bce22249a-logs" (OuterVolumeSpecName: "logs") pod "af082c59-dad5-4273-b6f1-7f9bce22249a" (UID: "af082c59-dad5-4273-b6f1-7f9bce22249a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.811058 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af082c59-dad5-4273-b6f1-7f9bce22249a-kube-api-access-9fsqs" (OuterVolumeSpecName: "kube-api-access-9fsqs") pod "af082c59-dad5-4273-b6f1-7f9bce22249a" (UID: "af082c59-dad5-4273-b6f1-7f9bce22249a"). InnerVolumeSpecName "kube-api-access-9fsqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.853657 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-config-data" (OuterVolumeSpecName: "config-data") pod "af082c59-dad5-4273-b6f1-7f9bce22249a" (UID: "af082c59-dad5-4273-b6f1-7f9bce22249a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.867393 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af082c59-dad5-4273-b6f1-7f9bce22249a" (UID: "af082c59-dad5-4273-b6f1-7f9bce22249a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.888980 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.889033 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af082c59-dad5-4273-b6f1-7f9bce22249a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.889050 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fsqs\" (UniqueName: \"kubernetes.io/projected/af082c59-dad5-4273-b6f1-7f9bce22249a-kube-api-access-9fsqs\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:33 crc kubenswrapper[4713]: I0314 05:55:33.889062 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af082c59-dad5-4273-b6f1-7f9bce22249a-logs\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.019725 4713 generic.go:334] "Generic (PLEG): container finished" podID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerID="27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1" exitCode=0
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.019789 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af082c59-dad5-4273-b6f1-7f9bce22249a","Type":"ContainerDied","Data":"27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1"}
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.019819 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af082c59-dad5-4273-b6f1-7f9bce22249a","Type":"ContainerDied","Data":"bb454dfbaecb8563d942ab0a701f52a05794f5514d5984fb3e7d575e999b93b4"}
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.019856 4713 scope.go:117] "RemoveContainer" containerID="27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.020019 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.102552 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.125432 4713 scope.go:117] "RemoveContainer" containerID="16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.128422 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.150616 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 14 05:55:34 crc kubenswrapper[4713]: E0314 05:55:34.151189 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-log"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.151222 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-log"
Mar 14 05:55:34 crc kubenswrapper[4713]: E0314 05:55:34.151272 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-api"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.151281 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-api"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.151555 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-log"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.151591 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" containerName="nova-api-api"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.152867 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.161815 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.161815 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.161992 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.164881 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.223894 4713 scope.go:117] "RemoveContainer" containerID="27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1"
Mar 14 05:55:34 crc kubenswrapper[4713]: E0314 05:55:34.227377 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1\": container with ID starting with 27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1 not found: ID does not exist" containerID="27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.227427 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1"} err="failed to get container status \"27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1\": rpc error: code = NotFound desc = could not find container \"27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1\": container with ID starting with 27582b7d5d355b31035cbca799de10dfbaa54b848c5a9a3d4f500bc29bc7fac1 not found: ID does not exist"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.227461 4713 scope.go:117] "RemoveContainer" containerID="16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e"
Mar 14 05:55:34 crc kubenswrapper[4713]: E0314 05:55:34.227832 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e\": container with ID starting with 16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e not found: ID does not exist" containerID="16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.227872 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e"} err="failed to get container status \"16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e\": rpc error: code = NotFound desc = could not find container \"16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e\": container with ID starting with 16d2866bb365c2cb1f9aeedbe2f9d8614222687dc85c0eef1963ede2baa8e27e not found: ID does not exist"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.316913 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.317323 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.317718 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-config-data\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.317758 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmr8m\" (UniqueName: \"kubernetes.io/projected/ee07833e-520c-4757-ba0c-4b8232ef2258-kube-api-access-wmr8m\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.317833 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee07833e-520c-4757-ba0c-4b8232ef2258-logs\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.317882 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-public-tls-certs\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.422043 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.422186 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.422312 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-config-data\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.422330 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmr8m\" (UniqueName: \"kubernetes.io/projected/ee07833e-520c-4757-ba0c-4b8232ef2258-kube-api-access-wmr8m\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.422360 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee07833e-520c-4757-ba0c-4b8232ef2258-logs\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.422385 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-public-tls-certs\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.425086 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee07833e-520c-4757-ba0c-4b8232ef2258-logs\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.431022 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.431285 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-public-tls-certs\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.431398 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.432887 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-config-data\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.447368 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmr8m\" (UniqueName: \"kubernetes.io/projected/ee07833e-520c-4757-ba0c-4b8232ef2258-kube-api-access-wmr8m\") pod \"nova-api-0\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " pod="openstack/nova-api-0"
Mar 14 05:55:34 crc kubenswrapper[4713]: I0314 05:55:34.488797 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 05:55:35 crc kubenswrapper[4713]: I0314 05:55:35.025754 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 14 05:55:35 crc kubenswrapper[4713]: W0314 05:55:35.029926 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee07833e_520c_4757_ba0c_4b8232ef2258.slice/crio-0d296bdd309c6982f4b6295de87ffb32f585bb2b1743c618f4002e95a96c30c0 WatchSource:0}: Error finding container 0d296bdd309c6982f4b6295de87ffb32f585bb2b1743c618f4002e95a96c30c0: Status 404 returned error can't find the container with id 0d296bdd309c6982f4b6295de87ffb32f585bb2b1743c618f4002e95a96c30c0
Mar 14 05:55:35 crc kubenswrapper[4713]: I0314 05:55:35.034673 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerStarted","Data":"4e39a3745d98fd2e2c4c33f967ae16aa87edf948fcbcd5f76b1699a2270821aa"}
Mar 14 05:55:35 crc kubenswrapper[4713]: I0314 05:55:35.034837 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="ceilometer-central-agent" containerID="cri-o://3764ae1af181ba0be89bfab90046d4136615a2cb3347388420d15d69c9b07f05" gracePeriod=30
Mar 14 05:55:35 crc kubenswrapper[4713]: I0314 05:55:35.034923 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 14 05:55:35 crc kubenswrapper[4713]: I0314 05:55:35.035126 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="sg-core" containerID="cri-o://8b3defe155af2ce3acd26995fa8925c1eea7a2928ce865f2a261f341c3a442d6" gracePeriod=30
Mar 14 05:55:35 crc kubenswrapper[4713]: I0314 05:55:35.035271 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="ceilometer-notification-agent" containerID="cri-o://89dfe6b3081314697d43fece08c5ec9c83ad0c9e79cf16386926da53b558d228" gracePeriod=30
Mar 14 05:55:35 crc kubenswrapper[4713]: I0314 05:55:35.035278 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="proxy-httpd" containerID="cri-o://4e39a3745d98fd2e2c4c33f967ae16aa87edf948fcbcd5f76b1699a2270821aa" gracePeriod=30
Mar 14 05:55:35 crc kubenswrapper[4713]: I0314 05:55:35.064360 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.056775182 podStartE2EDuration="9.064333535s" podCreationTimestamp="2026-03-14 05:55:26 +0000 UTC" firstStartedPulling="2026-03-14 05:55:27.808099938 +0000 UTC m=+1710.896009238" lastFinishedPulling="2026-03-14 05:55:33.815658291 +0000 UTC m=+1716.903567591" observedRunningTime="2026-03-14 05:55:35.064305664 +0000 UTC m=+1718.152214974" watchObservedRunningTime="2026-03-14 05:55:35.064333535 +0000 UTC m=+1718.152242855"
Mar 14 05:55:35 crc kubenswrapper[4713]: I0314 05:55:35.580686 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af082c59-dad5-4273-b6f1-7f9bce22249a" path="/var/lib/kubelet/pods/af082c59-dad5-4273-b6f1-7f9bce22249a/volumes"
Mar 14 05:55:35 crc kubenswrapper[4713]: E0314 05:55:35.906879 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]"
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.055647 4713 generic.go:334] "Generic (PLEG): container finished" podID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerID="4e39a3745d98fd2e2c4c33f967ae16aa87edf948fcbcd5f76b1699a2270821aa" exitCode=0
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.056764 4713 generic.go:334] "Generic (PLEG): container finished" podID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerID="8b3defe155af2ce3acd26995fa8925c1eea7a2928ce865f2a261f341c3a442d6" exitCode=2
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.056843 4713 generic.go:334] "Generic (PLEG): container finished" podID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerID="89dfe6b3081314697d43fece08c5ec9c83ad0c9e79cf16386926da53b558d228" exitCode=0
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.055760 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerDied","Data":"4e39a3745d98fd2e2c4c33f967ae16aa87edf948fcbcd5f76b1699a2270821aa"}
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.057019 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerDied","Data":"8b3defe155af2ce3acd26995fa8925c1eea7a2928ce865f2a261f341c3a442d6"}
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.057158 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerDied","Data":"89dfe6b3081314697d43fece08c5ec9c83ad0c9e79cf16386926da53b558d228"}
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.059057 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee07833e-520c-4757-ba0c-4b8232ef2258","Type":"ContainerStarted","Data":"cdc71ebc88e3bc2ca2f1a05f3b35723f940b423ec4e5cf9b0628c52e81b2838b"}
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.059102 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee07833e-520c-4757-ba0c-4b8232ef2258","Type":"ContainerStarted","Data":"f13bff5e8e34ed71274e9654585efaabb80d2d586b5cdb12d127b8620f7fedd5"}
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.059120 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee07833e-520c-4757-ba0c-4b8232ef2258","Type":"ContainerStarted","Data":"0d296bdd309c6982f4b6295de87ffb32f585bb2b1743c618f4002e95a96c30c0"}
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.083091 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.0830663 podStartE2EDuration="2.0830663s" podCreationTimestamp="2026-03-14 05:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:36.080046327 +0000 UTC m=+1719.167955637" watchObservedRunningTime="2026-03-14 05:55:36.0830663 +0000 UTC m=+1719.170975600"
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.250446 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.269588 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.545427 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh"
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.627043 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-gm7d5"]
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.628778 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" podUID="0266f913-2a1b-401a-aaaa-720ece998a13" containerName="dnsmasq-dns" containerID="cri-o://1a74f361b0e3a1b45ed165c0ab7947b78e4fae621be9591340f7d6066644e6cd" gracePeriod=10
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.707844 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 14 05:55:36 crc kubenswrapper[4713]: I0314 05:55:36.711950 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.081533 4713 generic.go:334] "Generic (PLEG): container finished" podID="0266f913-2a1b-401a-aaaa-720ece998a13" containerID="1a74f361b0e3a1b45ed165c0ab7947b78e4fae621be9591340f7d6066644e6cd" exitCode=0
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.081617 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" event={"ID":"0266f913-2a1b-401a-aaaa-720ece998a13","Type":"ContainerDied","Data":"1a74f361b0e3a1b45ed165c0ab7947b78e4fae621be9591340f7d6066644e6cd"}
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.085328 4713 generic.go:334] "Generic (PLEG): container finished" podID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerID="3764ae1af181ba0be89bfab90046d4136615a2cb3347388420d15d69c9b07f05" exitCode=0
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.086694 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerDied","Data":"3764ae1af181ba0be89bfab90046d4136615a2cb3347388420d15d69c9b07f05"}
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.086755 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae0e401a-85dc-48a2-8696-deef7c83c0b4","Type":"ContainerDied","Data":"a618b8f5d6343a287634458c0ff89d55b4d833d119ac577ea0f00cd02f453dc5"}
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.086776 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a618b8f5d6343a287634458c0ff89d55b4d833d119ac577ea0f00cd02f453dc5"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.122987 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.232020 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.330300 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-config-data\") pod \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.330412 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-scripts\") pod \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.330475 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-log-httpd\") pod \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.330568 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-sg-core-conf-yaml\") pod \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.330612 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hbwz\" (UniqueName: \"kubernetes.io/projected/ae0e401a-85dc-48a2-8696-deef7c83c0b4-kube-api-access-7hbwz\") pod \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.330655 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-run-httpd\") pod \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.330860 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-combined-ca-bundle\") pod \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\" (UID: \"ae0e401a-85dc-48a2-8696-deef7c83c0b4\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.340410 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ae0e401a-85dc-48a2-8696-deef7c83c0b4" (UID: "ae0e401a-85dc-48a2-8696-deef7c83c0b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.340447 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ae0e401a-85dc-48a2-8696-deef7c83c0b4" (UID: "ae0e401a-85dc-48a2-8696-deef7c83c0b4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.340921 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-scripts" (OuterVolumeSpecName: "scripts") pod "ae0e401a-85dc-48a2-8696-deef7c83c0b4" (UID: "ae0e401a-85dc-48a2-8696-deef7c83c0b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.346685 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae0e401a-85dc-48a2-8696-deef7c83c0b4-kube-api-access-7hbwz" (OuterVolumeSpecName: "kube-api-access-7hbwz") pod "ae0e401a-85dc-48a2-8696-deef7c83c0b4" (UID: "ae0e401a-85dc-48a2-8696-deef7c83c0b4"). InnerVolumeSpecName "kube-api-access-7hbwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.395858 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.403971 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6ftqx"]
Mar 14 05:55:37 crc kubenswrapper[4713]: E0314 05:55:37.404544 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="proxy-httpd"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404565 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="proxy-httpd"
Mar 14 05:55:37 crc kubenswrapper[4713]: E0314 05:55:37.404615 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="ceilometer-notification-agent"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404625 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="ceilometer-notification-agent"
Mar 14 05:55:37 crc kubenswrapper[4713]: E0314 05:55:37.404640 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="sg-core"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404646 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="sg-core"
Mar 14 05:55:37 crc kubenswrapper[4713]: E0314 05:55:37.404657 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0266f913-2a1b-401a-aaaa-720ece998a13" containerName="dnsmasq-dns"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404663 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0266f913-2a1b-401a-aaaa-720ece998a13" containerName="dnsmasq-dns"
Mar 14 05:55:37 crc kubenswrapper[4713]: E0314 05:55:37.404672 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="ceilometer-central-agent"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404678 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="ceilometer-central-agent"
Mar 14 05:55:37 crc kubenswrapper[4713]: E0314 05:55:37.404693 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0266f913-2a1b-401a-aaaa-720ece998a13" containerName="init"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404699 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0266f913-2a1b-401a-aaaa-720ece998a13" containerName="init"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404917 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="ceilometer-central-agent"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404935 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="sg-core"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404947 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="proxy-httpd"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404965 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0266f913-2a1b-401a-aaaa-720ece998a13" containerName="dnsmasq-dns"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.404978 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" containerName="ceilometer-notification-agent"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.406133 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6ftqx"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.408401 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.408638 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.415184 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6ftqx"]
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.444653 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.444689 4713 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.444700 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hbwz\" (UniqueName: \"kubernetes.io/projected/ae0e401a-85dc-48a2-8696-deef7c83c0b4-kube-api-access-7hbwz\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.444709 4713 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae0e401a-85dc-48a2-8696-deef7c83c0b4-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.449722 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ae0e401a-85dc-48a2-8696-deef7c83c0b4" (UID: "ae0e401a-85dc-48a2-8696-deef7c83c0b4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.549306 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thnsk\" (UniqueName: \"kubernetes.io/projected/0266f913-2a1b-401a-aaaa-720ece998a13-kube-api-access-thnsk\") pod \"0266f913-2a1b-401a-aaaa-720ece998a13\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.549465 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-swift-storage-0\") pod \"0266f913-2a1b-401a-aaaa-720ece998a13\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.549649 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-sb\") pod \"0266f913-2a1b-401a-aaaa-720ece998a13\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.549744 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-svc\") pod \"0266f913-2a1b-401a-aaaa-720ece998a13\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.549794 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-config\") pod \"0266f913-2a1b-401a-aaaa-720ece998a13\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.549969 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-nb\") pod \"0266f913-2a1b-401a-aaaa-720ece998a13\" (UID: \"0266f913-2a1b-401a-aaaa-720ece998a13\") "
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.550292 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q6r7\" (UniqueName: \"kubernetes.io/projected/fedb6b32-29d1-46af-a86f-1d96ebb1406d-kube-api-access-7q6r7\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.550382 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.550484 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-config-data\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.551346 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-scripts\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx"
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.551415 4713 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.554346 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0266f913-2a1b-401a-aaaa-720ece998a13-kube-api-access-thnsk" (OuterVolumeSpecName: "kube-api-access-thnsk") pod "0266f913-2a1b-401a-aaaa-720ece998a13" (UID: "0266f913-2a1b-401a-aaaa-720ece998a13"). InnerVolumeSpecName "kube-api-access-thnsk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.646882 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0266f913-2a1b-401a-aaaa-720ece998a13" (UID: "0266f913-2a1b-401a-aaaa-720ece998a13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.647753 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae0e401a-85dc-48a2-8696-deef7c83c0b4" (UID: "ae0e401a-85dc-48a2-8696-deef7c83c0b4"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.653877 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-scripts\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.655636 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q6r7\" (UniqueName: \"kubernetes.io/projected/fedb6b32-29d1-46af-a86f-1d96ebb1406d-kube-api-access-7q6r7\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.655747 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.655874 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-config-data\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.656081 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.656106 4713 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-thnsk\" (UniqueName: \"kubernetes.io/projected/0266f913-2a1b-401a-aaaa-720ece998a13-kube-api-access-thnsk\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.656124 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.660304 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-scripts\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.673400 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-config-data\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.675832 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.684581 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0266f913-2a1b-401a-aaaa-720ece998a13" (UID: "0266f913-2a1b-401a-aaaa-720ece998a13"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.689684 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0266f913-2a1b-401a-aaaa-720ece998a13" (UID: "0266f913-2a1b-401a-aaaa-720ece998a13"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.691804 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-config-data" (OuterVolumeSpecName: "config-data") pod "ae0e401a-85dc-48a2-8696-deef7c83c0b4" (UID: "ae0e401a-85dc-48a2-8696-deef7c83c0b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.695069 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-config" (OuterVolumeSpecName: "config") pod "0266f913-2a1b-401a-aaaa-720ece998a13" (UID: "0266f913-2a1b-401a-aaaa-720ece998a13"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.699943 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q6r7\" (UniqueName: \"kubernetes.io/projected/fedb6b32-29d1-46af-a86f-1d96ebb1406d-kube-api-access-7q6r7\") pod \"nova-cell1-cell-mapping-6ftqx\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.720984 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0266f913-2a1b-401a-aaaa-720ece998a13" (UID: "0266f913-2a1b-401a-aaaa-720ece998a13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.731254 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.5:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.747500 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.5:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.758872 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.758934 4713 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.758945 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae0e401a-85dc-48a2-8696-deef7c83c0b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.758958 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.758968 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0266f913-2a1b-401a-aaaa-720ece998a13-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:37 crc kubenswrapper[4713]: I0314 05:55:37.762443 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.106955 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" event={"ID":"0266f913-2a1b-401a-aaaa-720ece998a13","Type":"ContainerDied","Data":"7af120b1fc2e7497573a36eb0f1a40367ecb0dee0e050950983259a536d10ddd"} Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.107305 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-gm7d5" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.107401 4713 scope.go:117] "RemoveContainer" containerID="1a74f361b0e3a1b45ed165c0ab7947b78e4fae621be9591340f7d6066644e6cd" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.110377 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.167260 4713 scope.go:117] "RemoveContainer" containerID="30193214a23d679629fd346997d39f9f43b3e83bebea357fed4b78c8ef86cf28" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.174635 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-gm7d5"] Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.197272 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-gm7d5"] Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.213293 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.233763 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.265637 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.270394 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.272707 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.273068 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.312169 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.365300 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6ftqx"] Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.395487 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.395610 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-log-httpd\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.395720 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-config-data\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.395746 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.395823 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-run-httpd\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.395872 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wgtx\" (UniqueName: \"kubernetes.io/projected/0cb61232-d986-4a6e-a52d-26ff98998087-kube-api-access-8wgtx\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.395892 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-scripts\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.498301 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.498444 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-log-httpd\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " 
pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.498542 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-config-data\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.498559 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.498608 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-run-httpd\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.498653 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wgtx\" (UniqueName: \"kubernetes.io/projected/0cb61232-d986-4a6e-a52d-26ff98998087-kube-api-access-8wgtx\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.498682 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-scripts\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.499159 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-run-httpd\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.499226 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-log-httpd\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.504368 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.506826 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.506961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-scripts\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.507511 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-config-data\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.519654 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8wgtx\" (UniqueName: \"kubernetes.io/projected/0cb61232-d986-4a6e-a52d-26ff98998087-kube-api-access-8wgtx\") pod \"ceilometer-0\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " pod="openstack/ceilometer-0" Mar 14 05:55:38 crc kubenswrapper[4713]: I0314 05:55:38.610223 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:55:39 crc kubenswrapper[4713]: I0314 05:55:39.124161 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6ftqx" event={"ID":"fedb6b32-29d1-46af-a86f-1d96ebb1406d","Type":"ContainerStarted","Data":"7fe333034490b3791cc5827ef673cde354dd7a9b075e0742fc0ec26b4191d3f8"} Mar 14 05:55:39 crc kubenswrapper[4713]: I0314 05:55:39.124617 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6ftqx" event={"ID":"fedb6b32-29d1-46af-a86f-1d96ebb1406d","Type":"ContainerStarted","Data":"41b1e1e9d490b6a70dd36f4e02bf3c16245fbbb547971d77d95c89a764b9e2fa"} Mar 14 05:55:39 crc kubenswrapper[4713]: I0314 05:55:39.149789 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6ftqx" podStartSLOduration=2.149768776 podStartE2EDuration="2.149768776s" podCreationTimestamp="2026-03-14 05:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:39.146922959 +0000 UTC m=+1722.234832259" watchObservedRunningTime="2026-03-14 05:55:39.149768776 +0000 UTC m=+1722.237678076" Mar 14 05:55:39 crc kubenswrapper[4713]: W0314 05:55:39.194786 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cb61232_d986_4a6e_a52d_26ff98998087.slice/crio-611f585cb2d626fd143170c10a557cd3eca85f1d91ffc9fd9a290dcc1258a551 WatchSource:0}: Error finding container 
611f585cb2d626fd143170c10a557cd3eca85f1d91ffc9fd9a290dcc1258a551: Status 404 returned error can't find the container with id 611f585cb2d626fd143170c10a557cd3eca85f1d91ffc9fd9a290dcc1258a551 Mar 14 05:55:39 crc kubenswrapper[4713]: I0314 05:55:39.197119 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:55:39 crc kubenswrapper[4713]: I0314 05:55:39.578974 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0266f913-2a1b-401a-aaaa-720ece998a13" path="/var/lib/kubelet/pods/0266f913-2a1b-401a-aaaa-720ece998a13/volumes" Mar 14 05:55:39 crc kubenswrapper[4713]: I0314 05:55:39.580099 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae0e401a-85dc-48a2-8696-deef7c83c0b4" path="/var/lib/kubelet/pods/ae0e401a-85dc-48a2-8696-deef7c83c0b4/volumes" Mar 14 05:55:40 crc kubenswrapper[4713]: I0314 05:55:40.135974 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerStarted","Data":"ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97"} Mar 14 05:55:40 crc kubenswrapper[4713]: I0314 05:55:40.136334 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerStarted","Data":"611f585cb2d626fd143170c10a557cd3eca85f1d91ffc9fd9a290dcc1258a551"} Mar 14 05:55:40 crc kubenswrapper[4713]: I0314 05:55:40.731670 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 05:55:40 crc kubenswrapper[4713]: I0314 05:55:40.732345 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 05:55:40 crc kubenswrapper[4713]: I0314 05:55:40.732413 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 05:55:40 crc kubenswrapper[4713]: I0314 05:55:40.733778 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 05:55:40 crc kubenswrapper[4713]: I0314 05:55:40.734056 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" gracePeriod=600 Mar 14 05:55:40 crc kubenswrapper[4713]: E0314 05:55:40.880859 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:55:41 crc kubenswrapper[4713]: I0314 05:55:41.150648 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerStarted","Data":"438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c"} Mar 14 05:55:41 crc kubenswrapper[4713]: I0314 05:55:41.154966 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" exitCode=0 Mar 14 05:55:41 crc kubenswrapper[4713]: I0314 05:55:41.155015 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429"} Mar 14 05:55:41 crc kubenswrapper[4713]: I0314 05:55:41.155057 4713 scope.go:117] "RemoveContainer" containerID="540f96db525a7cfd501d37d526d1efc7f6c97b5c6c41b9d3d69eed7cce8a0419" Mar 14 05:55:41 crc kubenswrapper[4713]: I0314 05:55:41.156089 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:55:41 crc kubenswrapper[4713]: E0314 05:55:41.156603 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:55:42 crc kubenswrapper[4713]: I0314 05:55:42.173048 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerStarted","Data":"dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925"} Mar 14 05:55:44 crc kubenswrapper[4713]: I0314 05:55:44.196631 4713 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerStarted","Data":"46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1"} Mar 14 05:55:44 crc kubenswrapper[4713]: I0314 05:55:44.198964 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 05:55:44 crc kubenswrapper[4713]: I0314 05:55:44.201996 4713 generic.go:334] "Generic (PLEG): container finished" podID="fedb6b32-29d1-46af-a86f-1d96ebb1406d" containerID="7fe333034490b3791cc5827ef673cde354dd7a9b075e0742fc0ec26b4191d3f8" exitCode=0 Mar 14 05:55:44 crc kubenswrapper[4713]: I0314 05:55:44.202051 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6ftqx" event={"ID":"fedb6b32-29d1-46af-a86f-1d96ebb1406d","Type":"ContainerDied","Data":"7fe333034490b3791cc5827ef673cde354dd7a9b075e0742fc0ec26b4191d3f8"} Mar 14 05:55:44 crc kubenswrapper[4713]: I0314 05:55:44.230726 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.135976773 podStartE2EDuration="6.230701638s" podCreationTimestamp="2026-03-14 05:55:38 +0000 UTC" firstStartedPulling="2026-03-14 05:55:39.197782808 +0000 UTC m=+1722.285692108" lastFinishedPulling="2026-03-14 05:55:43.292507673 +0000 UTC m=+1726.380416973" observedRunningTime="2026-03-14 05:55:44.21774171 +0000 UTC m=+1727.305651030" watchObservedRunningTime="2026-03-14 05:55:44.230701638 +0000 UTC m=+1727.318610938" Mar 14 05:55:44 crc kubenswrapper[4713]: I0314 05:55:44.489383 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 05:55:44 crc kubenswrapper[4713]: I0314 05:55:44.489772 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 05:55:44 crc kubenswrapper[4713]: I0314 05:55:44.707976 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 14 05:55:44 crc kubenswrapper[4713]: I0314 05:55:44.708607 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.502502 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.8:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.502535 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.8:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.737895 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.789427 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-scripts\") pod \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.789665 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-config-data\") pod \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.789761 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q6r7\" (UniqueName: \"kubernetes.io/projected/fedb6b32-29d1-46af-a86f-1d96ebb1406d-kube-api-access-7q6r7\") pod \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.790430 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-combined-ca-bundle\") pod \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\" (UID: \"fedb6b32-29d1-46af-a86f-1d96ebb1406d\") " Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.799613 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-scripts" (OuterVolumeSpecName: "scripts") pod "fedb6b32-29d1-46af-a86f-1d96ebb1406d" (UID: "fedb6b32-29d1-46af-a86f-1d96ebb1406d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.800298 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fedb6b32-29d1-46af-a86f-1d96ebb1406d-kube-api-access-7q6r7" (OuterVolumeSpecName: "kube-api-access-7q6r7") pod "fedb6b32-29d1-46af-a86f-1d96ebb1406d" (UID: "fedb6b32-29d1-46af-a86f-1d96ebb1406d"). InnerVolumeSpecName "kube-api-access-7q6r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.830530 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-config-data" (OuterVolumeSpecName: "config-data") pod "fedb6b32-29d1-46af-a86f-1d96ebb1406d" (UID: "fedb6b32-29d1-46af-a86f-1d96ebb1406d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.853148 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fedb6b32-29d1-46af-a86f-1d96ebb1406d" (UID: "fedb6b32-29d1-46af-a86f-1d96ebb1406d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.895328 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.895522 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.895594 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q6r7\" (UniqueName: \"kubernetes.io/projected/fedb6b32-29d1-46af-a86f-1d96ebb1406d-kube-api-access-7q6r7\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:45 crc kubenswrapper[4713]: I0314 05:55:45.895658 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedb6b32-29d1-46af-a86f-1d96ebb1406d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:46 crc kubenswrapper[4713]: E0314 05:55:46.214575 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.227175 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6ftqx" event={"ID":"fedb6b32-29d1-46af-a86f-1d96ebb1406d","Type":"ContainerDied","Data":"41b1e1e9d490b6a70dd36f4e02bf3c16245fbbb547971d77d95c89a764b9e2fa"} Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.227237 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41b1e1e9d490b6a70dd36f4e02bf3c16245fbbb547971d77d95c89a764b9e2fa" Mar 14 
05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.227737 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6ftqx" Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.428427 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.428846 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-log" containerID="cri-o://f13bff5e8e34ed71274e9654585efaabb80d2d586b5cdb12d127b8620f7fedd5" gracePeriod=30 Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.428936 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-api" containerID="cri-o://cdc71ebc88e3bc2ca2f1a05f3b35723f940b423ec4e5cf9b0628c52e81b2838b" gracePeriod=30 Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.453086 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.453425 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="63f5eb7b-a993-4198-9ed7-4b7522223fb7" containerName="nova-scheduler-scheduler" containerID="cri-o://c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85" gracePeriod=30 Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.477568 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.477878 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-log" 
containerID="cri-o://eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3" gracePeriod=30 Mar 14 05:55:46 crc kubenswrapper[4713]: I0314 05:55:46.478984 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-metadata" containerID="cri-o://4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511" gracePeriod=30 Mar 14 05:55:47 crc kubenswrapper[4713]: I0314 05:55:47.242116 4713 generic.go:334] "Generic (PLEG): container finished" podID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerID="eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3" exitCode=143 Mar 14 05:55:47 crc kubenswrapper[4713]: I0314 05:55:47.242221 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe","Type":"ContainerDied","Data":"eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3"} Mar 14 05:55:47 crc kubenswrapper[4713]: I0314 05:55:47.244605 4713 generic.go:334] "Generic (PLEG): container finished" podID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerID="f13bff5e8e34ed71274e9654585efaabb80d2d586b5cdb12d127b8620f7fedd5" exitCode=143 Mar 14 05:55:47 crc kubenswrapper[4713]: I0314 05:55:47.244643 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee07833e-520c-4757-ba0c-4b8232ef2258","Type":"ContainerDied","Data":"f13bff5e8e34ed71274e9654585efaabb80d2d586b5cdb12d127b8620f7fedd5"} Mar 14 05:55:47 crc kubenswrapper[4713]: E0314 05:55:47.608257 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:55:48 crc kubenswrapper[4713]: E0314 05:55:48.108367 4713 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:55:48 crc kubenswrapper[4713]: E0314 05:55:48.109050 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]" Mar 14 05:55:48 crc kubenswrapper[4713]: E0314 05:55:48.518021 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85 is running failed: container process not found" containerID="c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 05:55:48 crc kubenswrapper[4713]: E0314 05:55:48.519345 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85 is running failed: container process not found" containerID="c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 05:55:48 crc kubenswrapper[4713]: E0314 05:55:48.521660 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85 is running failed: container process not found" containerID="c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 
05:55:48 crc kubenswrapper[4713]: E0314 05:55:48.521753 4713 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="63f5eb7b-a993-4198-9ed7-4b7522223fb7" containerName="nova-scheduler-scheduler" Mar 14 05:55:48 crc kubenswrapper[4713]: I0314 05:55:48.922409 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 05:55:48 crc kubenswrapper[4713]: I0314 05:55:48.979183 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-config-data\") pod \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " Mar 14 05:55:48 crc kubenswrapper[4713]: I0314 05:55:48.979310 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-combined-ca-bundle\") pod \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " Mar 14 05:55:48 crc kubenswrapper[4713]: I0314 05:55:48.979545 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6jrz\" (UniqueName: \"kubernetes.io/projected/63f5eb7b-a993-4198-9ed7-4b7522223fb7-kube-api-access-c6jrz\") pod \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\" (UID: \"63f5eb7b-a993-4198-9ed7-4b7522223fb7\") " Mar 14 05:55:48 crc kubenswrapper[4713]: I0314 05:55:48.988125 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f5eb7b-a993-4198-9ed7-4b7522223fb7-kube-api-access-c6jrz" (OuterVolumeSpecName: "kube-api-access-c6jrz") pod 
"63f5eb7b-a993-4198-9ed7-4b7522223fb7" (UID: "63f5eb7b-a993-4198-9ed7-4b7522223fb7"). InnerVolumeSpecName "kube-api-access-c6jrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.025788 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63f5eb7b-a993-4198-9ed7-4b7522223fb7" (UID: "63f5eb7b-a993-4198-9ed7-4b7522223fb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.028990 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-config-data" (OuterVolumeSpecName: "config-data") pod "63f5eb7b-a993-4198-9ed7-4b7522223fb7" (UID: "63f5eb7b-a993-4198-9ed7-4b7522223fb7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.081926 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.081976 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f5eb7b-a993-4198-9ed7-4b7522223fb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.082004 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6jrz\" (UniqueName: \"kubernetes.io/projected/63f5eb7b-a993-4198-9ed7-4b7522223fb7-kube-api-access-c6jrz\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.273888 4713 generic.go:334] "Generic (PLEG): container finished" podID="63f5eb7b-a993-4198-9ed7-4b7522223fb7" containerID="c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85" exitCode=0 Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.273972 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.273994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63f5eb7b-a993-4198-9ed7-4b7522223fb7","Type":"ContainerDied","Data":"c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85"} Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.274590 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63f5eb7b-a993-4198-9ed7-4b7522223fb7","Type":"ContainerDied","Data":"8592a757bdd2c58cc8fb38b13efa42561cfd7bf25e1c61206c5893f2f1f28052"} Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.274613 4713 scope.go:117] "RemoveContainer" containerID="c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.324958 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.344245 4713 scope.go:117] "RemoveContainer" containerID="c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85" Mar 14 05:55:49 crc kubenswrapper[4713]: E0314 05:55:49.344874 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85\": container with ID starting with c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85 not found: ID does not exist" containerID="c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.344921 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85"} err="failed to get container status \"c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85\": rpc error: code = NotFound 
desc = could not find container \"c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85\": container with ID starting with c1868c444022af04dbb12f9f2934a04572b7607befd7331431d4dccdc4afed85 not found: ID does not exist" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.344978 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.370973 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:55:49 crc kubenswrapper[4713]: E0314 05:55:49.371903 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fedb6b32-29d1-46af-a86f-1d96ebb1406d" containerName="nova-manage" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.371939 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="fedb6b32-29d1-46af-a86f-1d96ebb1406d" containerName="nova-manage" Mar 14 05:55:49 crc kubenswrapper[4713]: E0314 05:55:49.372019 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f5eb7b-a993-4198-9ed7-4b7522223fb7" containerName="nova-scheduler-scheduler" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.372034 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f5eb7b-a993-4198-9ed7-4b7522223fb7" containerName="nova-scheduler-scheduler" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.372335 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f5eb7b-a993-4198-9ed7-4b7522223fb7" containerName="nova-scheduler-scheduler" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.372354 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="fedb6b32-29d1-46af-a86f-1d96ebb1406d" containerName="nova-manage" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.379617 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.382021 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.386888 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.498715 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kb69\" (UniqueName: \"kubernetes.io/projected/d520c464-934b-4fee-b00c-e4f227de360e-kube-api-access-9kb69\") pod \"nova-scheduler-0\" (UID: \"d520c464-934b-4fee-b00c-e4f227de360e\") " pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.499116 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d520c464-934b-4fee-b00c-e4f227de360e-config-data\") pod \"nova-scheduler-0\" (UID: \"d520c464-934b-4fee-b00c-e4f227de360e\") " pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.499641 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d520c464-934b-4fee-b00c-e4f227de360e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d520c464-934b-4fee-b00c-e4f227de360e\") " pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.579381 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f5eb7b-a993-4198-9ed7-4b7522223fb7" path="/var/lib/kubelet/pods/63f5eb7b-a993-4198-9ed7-4b7522223fb7/volumes" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.602386 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d520c464-934b-4fee-b00c-e4f227de360e-config-data\") pod \"nova-scheduler-0\" (UID: \"d520c464-934b-4fee-b00c-e4f227de360e\") " pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.602552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d520c464-934b-4fee-b00c-e4f227de360e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d520c464-934b-4fee-b00c-e4f227de360e\") " pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.602693 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kb69\" (UniqueName: \"kubernetes.io/projected/d520c464-934b-4fee-b00c-e4f227de360e-kube-api-access-9kb69\") pod \"nova-scheduler-0\" (UID: \"d520c464-934b-4fee-b00c-e4f227de360e\") " pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.621147 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d520c464-934b-4fee-b00c-e4f227de360e-config-data\") pod \"nova-scheduler-0\" (UID: \"d520c464-934b-4fee-b00c-e4f227de360e\") " pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.626982 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d520c464-934b-4fee-b00c-e4f227de360e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d520c464-934b-4fee-b00c-e4f227de360e\") " pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.633773 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kb69\" (UniqueName: \"kubernetes.io/projected/d520c464-934b-4fee-b00c-e4f227de360e-kube-api-access-9kb69\") pod \"nova-scheduler-0\" (UID: \"d520c464-934b-4fee-b00c-e4f227de360e\") " 
pod="openstack/nova-scheduler-0" Mar 14 05:55:49 crc kubenswrapper[4713]: I0314 05:55:49.707652 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.178452 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.237588 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-combined-ca-bundle\") pod \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.238003 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh6c4\" (UniqueName: \"kubernetes.io/projected/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-kube-api-access-rh6c4\") pod \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.238099 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-config-data\") pod \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.238178 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-nova-metadata-tls-certs\") pod \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.238297 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-logs\") pod \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\" (UID: \"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe\") " Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.239281 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-logs" (OuterVolumeSpecName: "logs") pod "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" (UID: "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.245323 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-kube-api-access-rh6c4" (OuterVolumeSpecName: "kube-api-access-rh6c4") pod "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" (UID: "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe"). InnerVolumeSpecName "kube-api-access-rh6c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.279495 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" (UID: "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.302249 4713 generic.go:334] "Generic (PLEG): container finished" podID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerID="4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511" exitCode=0 Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.302426 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe","Type":"ContainerDied","Data":"4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511"} Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.302488 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8db1851a-ca1f-49cd-bfe9-ce42e3e832fe","Type":"ContainerDied","Data":"69d02b6ab0817efa0cee20757ef17dd68957386f3fe2b0ac4cceb104e9ff2d0c"} Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.302511 4713 scope.go:117] "RemoveContainer" containerID="4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.302514 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.306052 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-config-data" (OuterVolumeSpecName: "config-data") pod "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" (UID: "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.314833 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.322081 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" (UID: "8db1851a-ca1f-49cd-bfe9-ce42e3e832fe"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.342311 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-logs\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.342409 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.342470 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh6c4\" (UniqueName: \"kubernetes.io/projected/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-kube-api-access-rh6c4\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.342540 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.342605 4713 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe-nova-metadata-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.347869 4713 scope.go:117] "RemoveContainer" containerID="eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.393768 4713 scope.go:117] "RemoveContainer" containerID="4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511" Mar 14 05:55:50 crc kubenswrapper[4713]: E0314 05:55:50.395324 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511\": container with ID starting with 4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511 not found: ID does not exist" containerID="4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.395376 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511"} err="failed to get container status \"4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511\": rpc error: code = NotFound desc = could not find container \"4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511\": container with ID starting with 4fb244378c7697cd0b5e8b130455d4f74b134df994a64ec28fb17bf010be9511 not found: ID does not exist" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.395408 4713 scope.go:117] "RemoveContainer" containerID="eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3" Mar 14 05:55:50 crc kubenswrapper[4713]: E0314 05:55:50.395817 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3\": container with ID starting with eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3 not found: ID does not exist" 
containerID="eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.395876 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3"} err="failed to get container status \"eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3\": rpc error: code = NotFound desc = could not find container \"eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3\": container with ID starting with eec4c4cafe4ed034d46770755ade42822626d592945d47f87e4fd6f714bca8a3 not found: ID does not exist" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.728241 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.748889 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.768541 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:50 crc kubenswrapper[4713]: E0314 05:55:50.770015 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-log" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.770044 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-log" Mar 14 05:55:50 crc kubenswrapper[4713]: E0314 05:55:50.770117 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-metadata" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.770128 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-metadata" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.770673 
4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-log" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.770745 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" containerName="nova-metadata-metadata" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.772726 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.779664 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.780014 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.782536 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.856692 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6019f4ac-3776-409b-ba3c-64d1739791a7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.856788 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6019f4ac-3776-409b-ba3c-64d1739791a7-logs\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.856819 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6019f4ac-3776-409b-ba3c-64d1739791a7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.856957 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfb2k\" (UniqueName: \"kubernetes.io/projected/6019f4ac-3776-409b-ba3c-64d1739791a7-kube-api-access-dfb2k\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.857047 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6019f4ac-3776-409b-ba3c-64d1739791a7-config-data\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.960258 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfb2k\" (UniqueName: \"kubernetes.io/projected/6019f4ac-3776-409b-ba3c-64d1739791a7-kube-api-access-dfb2k\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.960399 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6019f4ac-3776-409b-ba3c-64d1739791a7-config-data\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.960479 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6019f4ac-3776-409b-ba3c-64d1739791a7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.960554 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6019f4ac-3776-409b-ba3c-64d1739791a7-logs\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.960576 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6019f4ac-3776-409b-ba3c-64d1739791a7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.961225 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6019f4ac-3776-409b-ba3c-64d1739791a7-logs\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.965730 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6019f4ac-3776-409b-ba3c-64d1739791a7-config-data\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.965794 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6019f4ac-3776-409b-ba3c-64d1739791a7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.966855 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6019f4ac-3776-409b-ba3c-64d1739791a7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:50 crc kubenswrapper[4713]: I0314 05:55:50.980775 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfb2k\" (UniqueName: \"kubernetes.io/projected/6019f4ac-3776-409b-ba3c-64d1739791a7-kube-api-access-dfb2k\") pod \"nova-metadata-0\" (UID: \"6019f4ac-3776-409b-ba3c-64d1739791a7\") " pod="openstack/nova-metadata-0" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.100174 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.324047 4713 generic.go:334] "Generic (PLEG): container finished" podID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerID="cdc71ebc88e3bc2ca2f1a05f3b35723f940b423ec4e5cf9b0628c52e81b2838b" exitCode=0 Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.324453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee07833e-520c-4757-ba0c-4b8232ef2258","Type":"ContainerDied","Data":"cdc71ebc88e3bc2ca2f1a05f3b35723f940b423ec4e5cf9b0628c52e81b2838b"} Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.327052 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d520c464-934b-4fee-b00c-e4f227de360e","Type":"ContainerStarted","Data":"3a0f0f34602e9e91f47e34d9fc7accc41237bef72b15f81931a545c664a4a758"} Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.327141 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d520c464-934b-4fee-b00c-e4f227de360e","Type":"ContainerStarted","Data":"676f30597e814eaecfcca5728a4edf94f2d2fbbe99800ef699f09360f4604655"} Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.363927 4713 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.363903924 podStartE2EDuration="2.363903924s" podCreationTimestamp="2026-03-14 05:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:51.355910859 +0000 UTC m=+1734.443820159" watchObservedRunningTime="2026-03-14 05:55:51.363903924 +0000 UTC m=+1734.451813224" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.388762 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.490049 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee07833e-520c-4757-ba0c-4b8232ef2258-logs\") pod \"ee07833e-520c-4757-ba0c-4b8232ef2258\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.490139 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-internal-tls-certs\") pod \"ee07833e-520c-4757-ba0c-4b8232ef2258\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.490324 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmr8m\" (UniqueName: \"kubernetes.io/projected/ee07833e-520c-4757-ba0c-4b8232ef2258-kube-api-access-wmr8m\") pod \"ee07833e-520c-4757-ba0c-4b8232ef2258\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.490356 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-combined-ca-bundle\") pod 
\"ee07833e-520c-4757-ba0c-4b8232ef2258\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.491014 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-public-tls-certs\") pod \"ee07833e-520c-4757-ba0c-4b8232ef2258\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.490640 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee07833e-520c-4757-ba0c-4b8232ef2258-logs" (OuterVolumeSpecName: "logs") pod "ee07833e-520c-4757-ba0c-4b8232ef2258" (UID: "ee07833e-520c-4757-ba0c-4b8232ef2258"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.491579 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-config-data\") pod \"ee07833e-520c-4757-ba0c-4b8232ef2258\" (UID: \"ee07833e-520c-4757-ba0c-4b8232ef2258\") " Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.492226 4713 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee07833e-520c-4757-ba0c-4b8232ef2258-logs\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.495671 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee07833e-520c-4757-ba0c-4b8232ef2258-kube-api-access-wmr8m" (OuterVolumeSpecName: "kube-api-access-wmr8m") pod "ee07833e-520c-4757-ba0c-4b8232ef2258" (UID: "ee07833e-520c-4757-ba0c-4b8232ef2258"). InnerVolumeSpecName "kube-api-access-wmr8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.538339 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee07833e-520c-4757-ba0c-4b8232ef2258" (UID: "ee07833e-520c-4757-ba0c-4b8232ef2258"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.555515 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-config-data" (OuterVolumeSpecName: "config-data") pod "ee07833e-520c-4757-ba0c-4b8232ef2258" (UID: "ee07833e-520c-4757-ba0c-4b8232ef2258"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.558880 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ee07833e-520c-4757-ba0c-4b8232ef2258" (UID: "ee07833e-520c-4757-ba0c-4b8232ef2258"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.568024 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ee07833e-520c-4757-ba0c-4b8232ef2258" (UID: "ee07833e-520c-4757-ba0c-4b8232ef2258"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.580949 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db1851a-ca1f-49cd-bfe9-ce42e3e832fe" path="/var/lib/kubelet/pods/8db1851a-ca1f-49cd-bfe9-ce42e3e832fe/volumes" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.594640 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.594832 4713 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.594901 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmr8m\" (UniqueName: \"kubernetes.io/projected/ee07833e-520c-4757-ba0c-4b8232ef2258-kube-api-access-wmr8m\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.594957 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.595017 4713 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee07833e-520c-4757-ba0c-4b8232ef2258-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:55:51 crc kubenswrapper[4713]: I0314 05:55:51.704347 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 05:55:51 crc kubenswrapper[4713]: W0314 05:55:51.704948 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6019f4ac_3776_409b_ba3c_64d1739791a7.slice/crio-baae9101fc66dc80c736e098ebb525b2aeaa6b04dedd1e67db8844f15b3ed526 WatchSource:0}: Error finding container baae9101fc66dc80c736e098ebb525b2aeaa6b04dedd1e67db8844f15b3ed526: Status 404 returned error can't find the container with id baae9101fc66dc80c736e098ebb525b2aeaa6b04dedd1e67db8844f15b3ed526 Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.364002 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee07833e-520c-4757-ba0c-4b8232ef2258","Type":"ContainerDied","Data":"0d296bdd309c6982f4b6295de87ffb32f585bb2b1743c618f4002e95a96c30c0"} Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.364355 4713 scope.go:117] "RemoveContainer" containerID="cdc71ebc88e3bc2ca2f1a05f3b35723f940b423ec4e5cf9b0628c52e81b2838b" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.364491 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.380896 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6019f4ac-3776-409b-ba3c-64d1739791a7","Type":"ContainerStarted","Data":"724a5be108dc57fbf583a888904c44abbd99d92883c8899f9b3c05fee0625655"} Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.381152 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6019f4ac-3776-409b-ba3c-64d1739791a7","Type":"ContainerStarted","Data":"6a92c8f160dac5e9527157cc7320703a08e5662b2fa1429b0cf5eca150077b45"} Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.381170 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6019f4ac-3776-409b-ba3c-64d1739791a7","Type":"ContainerStarted","Data":"baae9101fc66dc80c736e098ebb525b2aeaa6b04dedd1e67db8844f15b3ed526"} Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.407402 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.407552 4713 scope.go:117] "RemoveContainer" containerID="f13bff5e8e34ed71274e9654585efaabb80d2d586b5cdb12d127b8620f7fedd5" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.425249 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.432352 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.432319782 podStartE2EDuration="2.432319782s" podCreationTimestamp="2026-03-14 05:55:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:52.414085883 +0000 UTC m=+1735.501995203" watchObservedRunningTime="2026-03-14 05:55:52.432319782 +0000 UTC m=+1735.520229092" Mar 14 
05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.477304 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:52 crc kubenswrapper[4713]: E0314 05:55:52.478511 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-api" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.478607 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-api" Mar 14 05:55:52 crc kubenswrapper[4713]: E0314 05:55:52.478736 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-log" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.478825 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-log" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.479250 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-log" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.479536 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" containerName="nova-api-api" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.481830 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.484179 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.487853 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.489637 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.489989 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.626276 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-logs\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.626331 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.626372 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.626625 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-config-data\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.626762 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4x74\" (UniqueName: \"kubernetes.io/projected/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-kube-api-access-m4x74\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.626828 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.728926 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-logs\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.729301 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.729381 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc 
kubenswrapper[4713]: I0314 05:55:52.729449 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-logs\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.729638 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-config-data\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.729854 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4x74\" (UniqueName: \"kubernetes.io/projected/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-kube-api-access-m4x74\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.729917 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.733545 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.734059 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.734331 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-config-data\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.735282 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.748532 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4x74\" (UniqueName: \"kubernetes.io/projected/f0f4fde5-7672-47ca-9935-b0d5124f5b2d-kube-api-access-m4x74\") pod \"nova-api-0\" (UID: \"f0f4fde5-7672-47ca-9935-b0d5124f5b2d\") " pod="openstack/nova-api-0" Mar 14 05:55:52 crc kubenswrapper[4713]: I0314 05:55:52.825848 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 05:55:53 crc kubenswrapper[4713]: W0314 05:55:53.351716 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0f4fde5_7672_47ca_9935_b0d5124f5b2d.slice/crio-a0e82703376bbcf771e6d27227b93b50d02ef100a49fc84334e7a04750443813 WatchSource:0}: Error finding container a0e82703376bbcf771e6d27227b93b50d02ef100a49fc84334e7a04750443813: Status 404 returned error can't find the container with id a0e82703376bbcf771e6d27227b93b50d02ef100a49fc84334e7a04750443813 Mar 14 05:55:53 crc kubenswrapper[4713]: I0314 05:55:53.352256 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 05:55:53 crc kubenswrapper[4713]: I0314 05:55:53.393889 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0f4fde5-7672-47ca-9935-b0d5124f5b2d","Type":"ContainerStarted","Data":"a0e82703376bbcf771e6d27227b93b50d02ef100a49fc84334e7a04750443813"} Mar 14 05:55:53 crc kubenswrapper[4713]: I0314 05:55:53.588797 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee07833e-520c-4757-ba0c-4b8232ef2258" path="/var/lib/kubelet/pods/ee07833e-520c-4757-ba0c-4b8232ef2258/volumes" Mar 14 05:55:54 crc kubenswrapper[4713]: I0314 05:55:54.598942 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0f4fde5-7672-47ca-9935-b0d5124f5b2d","Type":"ContainerStarted","Data":"3e05f8ca16a7b6c3562c5884f5bfda5d1e490312316b07e10e0e20f98985ac59"} Mar 14 05:55:54 crc kubenswrapper[4713]: I0314 05:55:54.602106 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0f4fde5-7672-47ca-9935-b0d5124f5b2d","Type":"ContainerStarted","Data":"6541f6ac1ad83166dc7041b38d340c679649bd18e437165f5a1ae95f7c8a156a"} Mar 14 05:55:54 crc kubenswrapper[4713]: I0314 05:55:54.657530 4713 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.657508526 podStartE2EDuration="2.657508526s" podCreationTimestamp="2026-03-14 05:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:55:54.656090413 +0000 UTC m=+1737.743999713" watchObservedRunningTime="2026-03-14 05:55:54.657508526 +0000 UTC m=+1737.745417826"
Mar 14 05:55:54 crc kubenswrapper[4713]: I0314 05:55:54.709050 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 14 05:55:55 crc kubenswrapper[4713]: I0314 05:55:55.564797 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429"
Mar 14 05:55:55 crc kubenswrapper[4713]: E0314 05:55:55.565319 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 05:55:56 crc kubenswrapper[4713]: E0314 05:55:56.509011 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13495e5d_b3b5_46aa_8f15_4f6f8d4e85d3.slice\": RecentStats: unable to find data in memory cache]"
Mar 14 05:55:59 crc kubenswrapper[4713]: I0314 05:55:59.709433 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 14 05:55:59 crc kubenswrapper[4713]: I0314 05:55:59.741809 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.147102 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557796-v46qf"]
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.149432 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557796-v46qf"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.152380 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.152389 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.152904 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.162041 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557796-v46qf"]
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.174828 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg5bn\" (UniqueName: \"kubernetes.io/projected/12e0467d-86c8-4ec7-af58-a2e9f3c4dd79-kube-api-access-fg5bn\") pod \"auto-csr-approver-29557796-v46qf\" (UID: \"12e0467d-86c8-4ec7-af58-a2e9f3c4dd79\") " pod="openshift-infra/auto-csr-approver-29557796-v46qf"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.277838 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg5bn\" (UniqueName: \"kubernetes.io/projected/12e0467d-86c8-4ec7-af58-a2e9f3c4dd79-kube-api-access-fg5bn\") pod \"auto-csr-approver-29557796-v46qf\" (UID: \"12e0467d-86c8-4ec7-af58-a2e9f3c4dd79\") " pod="openshift-infra/auto-csr-approver-29557796-v46qf"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.305961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg5bn\" (UniqueName: \"kubernetes.io/projected/12e0467d-86c8-4ec7-af58-a2e9f3c4dd79-kube-api-access-fg5bn\") pod \"auto-csr-approver-29557796-v46qf\" (UID: \"12e0467d-86c8-4ec7-af58-a2e9f3c4dd79\") " pod="openshift-infra/auto-csr-approver-29557796-v46qf"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.471987 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557796-v46qf"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.713739 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 14 05:56:00 crc kubenswrapper[4713]: I0314 05:56:00.991882 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557796-v46qf"]
Mar 14 05:56:01 crc kubenswrapper[4713]: W0314 05:56:01.009522 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12e0467d_86c8_4ec7_af58_a2e9f3c4dd79.slice/crio-f0b6781a480b97dfac53e47706ac61444d37d6a619188142239076cf161dcaaf WatchSource:0}: Error finding container f0b6781a480b97dfac53e47706ac61444d37d6a619188142239076cf161dcaaf: Status 404 returned error can't find the container with id f0b6781a480b97dfac53e47706ac61444d37d6a619188142239076cf161dcaaf
Mar 14 05:56:01 crc kubenswrapper[4713]: I0314 05:56:01.101263 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 14 05:56:01 crc kubenswrapper[4713]: I0314 05:56:01.101587 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 14 05:56:01 crc kubenswrapper[4713]: I0314 05:56:01.683758 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557796-v46qf" event={"ID":"12e0467d-86c8-4ec7-af58-a2e9f3c4dd79","Type":"ContainerStarted","Data":"f0b6781a480b97dfac53e47706ac61444d37d6a619188142239076cf161dcaaf"}
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.114398 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6019f4ac-3776-409b-ba3c-64d1739791a7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.12:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.114437 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6019f4ac-3776-409b-ba3c-64d1739791a7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.12:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.440785 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.543916 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-combined-ca-bundle\") pod \"f42f8799-d0e6-47c8-a847-53ea94d892b6\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") "
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.544083 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8wnf\" (UniqueName: \"kubernetes.io/projected/f42f8799-d0e6-47c8-a847-53ea94d892b6-kube-api-access-d8wnf\") pod \"f42f8799-d0e6-47c8-a847-53ea94d892b6\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") "
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.544134 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-config-data\") pod \"f42f8799-d0e6-47c8-a847-53ea94d892b6\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") "
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.544244 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-scripts\") pod \"f42f8799-d0e6-47c8-a847-53ea94d892b6\" (UID: \"f42f8799-d0e6-47c8-a847-53ea94d892b6\") "
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.551339 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-scripts" (OuterVolumeSpecName: "scripts") pod "f42f8799-d0e6-47c8-a847-53ea94d892b6" (UID: "f42f8799-d0e6-47c8-a847-53ea94d892b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.564578 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42f8799-d0e6-47c8-a847-53ea94d892b6-kube-api-access-d8wnf" (OuterVolumeSpecName: "kube-api-access-d8wnf") pod "f42f8799-d0e6-47c8-a847-53ea94d892b6" (UID: "f42f8799-d0e6-47c8-a847-53ea94d892b6"). InnerVolumeSpecName "kube-api-access-d8wnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.650433 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8wnf\" (UniqueName: \"kubernetes.io/projected/f42f8799-d0e6-47c8-a847-53ea94d892b6-kube-api-access-d8wnf\") on node \"crc\" DevicePath \"\""
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.650721 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.701025 4713 generic.go:334] "Generic (PLEG): container finished" podID="12e0467d-86c8-4ec7-af58-a2e9f3c4dd79" containerID="17266f7dc0dea3431c2c76e2db30536c71b44274e4700047f49fad0a009b00e9" exitCode=0
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.701110 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557796-v46qf" event={"ID":"12e0467d-86c8-4ec7-af58-a2e9f3c4dd79","Type":"ContainerDied","Data":"17266f7dc0dea3431c2c76e2db30536c71b44274e4700047f49fad0a009b00e9"}
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.707335 4713 generic.go:334] "Generic (PLEG): container finished" podID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerID="32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f" exitCode=137
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.707390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerDied","Data":"32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f"}
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.707422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f42f8799-d0e6-47c8-a847-53ea94d892b6","Type":"ContainerDied","Data":"da33f2b3234f85003641b9627b52457ef2bd5465293cc0ea5ae142244892f29e"}
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.707439 4713 scope.go:117] "RemoveContainer" containerID="32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.707584 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 14 05:56:02 crc kubenswrapper[4713]: E0314 05:56:02.724777 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12e0467d_86c8_4ec7_af58_a2e9f3c4dd79.slice/crio-17266f7dc0dea3431c2c76e2db30536c71b44274e4700047f49fad0a009b00e9.scope\": RecentStats: unable to find data in memory cache]"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.753531 4713 scope.go:117] "RemoveContainer" containerID="8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.763798 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f42f8799-d0e6-47c8-a847-53ea94d892b6" (UID: "f42f8799-d0e6-47c8-a847-53ea94d892b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.777293 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-config-data" (OuterVolumeSpecName: "config-data") pod "f42f8799-d0e6-47c8-a847-53ea94d892b6" (UID: "f42f8799-d0e6-47c8-a847-53ea94d892b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.779052 4713 scope.go:117] "RemoveContainer" containerID="8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.811183 4713 scope.go:117] "RemoveContainer" containerID="705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.826600 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.826675 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.843326 4713 scope.go:117] "RemoveContainer" containerID="32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f"
Mar 14 05:56:02 crc kubenswrapper[4713]: E0314 05:56:02.844382 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f\": container with ID starting with 32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f not found: ID does not exist" containerID="32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.844440 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f"} err="failed to get container status \"32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f\": rpc error: code = NotFound desc = could not find container \"32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f\": container with ID starting with 32e3103be541cd57c2783fd4ab939b0f8136aa8654f1a2198d24dd999e0aec3f not found: ID does not exist"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.844474 4713 scope.go:117] "RemoveContainer" containerID="8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566"
Mar 14 05:56:02 crc kubenswrapper[4713]: E0314 05:56:02.844981 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566\": container with ID starting with 8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566 not found: ID does not exist" containerID="8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.845013 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566"} err="failed to get container status \"8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566\": rpc error: code = NotFound desc = could not find container \"8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566\": container with ID starting with 8c8342c455cd1193ec9ae9002d515e8e21564cb3f72e7d164f1917b06f8f7566 not found: ID does not exist"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.845035 4713 scope.go:117] "RemoveContainer" containerID="8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf"
Mar 14 05:56:02 crc kubenswrapper[4713]: E0314 05:56:02.845356 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf\": container with ID starting with 8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf not found: ID does not exist" containerID="8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.845406 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf"} err="failed to get container status \"8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf\": rpc error: code = NotFound desc = could not find container \"8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf\": container with ID starting with 8d4e57d228f1715b21224cded270a4f2e1d1738aa17fd0d6bbaf0addd9599acf not found: ID does not exist"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.845425 4713 scope.go:117] "RemoveContainer" containerID="705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6"
Mar 14 05:56:02 crc kubenswrapper[4713]: E0314 05:56:02.845842 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6\": container with ID starting with 705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6 not found: ID does not exist" containerID="705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.845878 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6"} err="failed to get container status \"705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6\": rpc error: code = NotFound desc = could not find container \"705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6\": container with ID starting with 705b761d594094b117379577c76d2b1ced86e4243724169a9f1fceec2bf83ee6 not found: ID does not exist"
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.854832 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 05:56:02 crc kubenswrapper[4713]: I0314 05:56:02.854879 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f42f8799-d0e6-47c8-a847-53ea94d892b6-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.076470 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.089788 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.115005 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Mar 14 05:56:03 crc kubenswrapper[4713]: E0314 05:56:03.115694 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-notifier"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.115709 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-notifier"
Mar 14 05:56:03 crc kubenswrapper[4713]: E0314 05:56:03.115752 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-api"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.115758 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-api"
Mar 14 05:56:03 crc kubenswrapper[4713]: E0314 05:56:03.115778 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-evaluator"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.115784 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-evaluator"
Mar 14 05:56:03 crc kubenswrapper[4713]: E0314 05:56:03.115792 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-listener"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.115797 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-listener"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.116047 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-listener"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.116084 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-evaluator"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.116101 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-api"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.116116 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" containerName="aodh-notifier"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.118831 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.124881 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.125186 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.125484 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2n7nm"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.125676 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.125832 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.140426 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.273322 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-config-data\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.273493 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxf4\" (UniqueName: \"kubernetes.io/projected/8d161301-7e1c-4b23-a6d4-f250cd1ff761-kube-api-access-mnxf4\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.273537 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-scripts\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.273570 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-public-tls-certs\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.273624 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.273746 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-internal-tls-certs\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.376663 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-config-data\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.377063 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxf4\" (UniqueName: \"kubernetes.io/projected/8d161301-7e1c-4b23-a6d4-f250cd1ff761-kube-api-access-mnxf4\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.377102 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-scripts\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.377128 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-public-tls-certs\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.377178 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.377291 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-internal-tls-certs\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.382444 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-scripts\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.383132 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-internal-tls-certs\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.388829 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-config-data\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.389128 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-public-tls-certs\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.398930 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxf4\" (UniqueName: \"kubernetes.io/projected/8d161301-7e1c-4b23-a6d4-f250cd1ff761-kube-api-access-mnxf4\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.405261 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.497619 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.635736 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f42f8799-d0e6-47c8-a847-53ea94d892b6" path="/var/lib/kubelet/pods/f42f8799-d0e6-47c8-a847-53ea94d892b6/volumes"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.876365 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f0f4fde5-7672-47ca-9935-b0d5124f5b2d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.13:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 05:56:03 crc kubenswrapper[4713]: I0314 05:56:03.876553 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f0f4fde5-7672-47ca-9935-b0d5124f5b2d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.13:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.059609 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.249637 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557796-v46qf"
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.425764 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg5bn\" (UniqueName: \"kubernetes.io/projected/12e0467d-86c8-4ec7-af58-a2e9f3c4dd79-kube-api-access-fg5bn\") pod \"12e0467d-86c8-4ec7-af58-a2e9f3c4dd79\" (UID: \"12e0467d-86c8-4ec7-af58-a2e9f3c4dd79\") "
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.432046 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e0467d-86c8-4ec7-af58-a2e9f3c4dd79-kube-api-access-fg5bn" (OuterVolumeSpecName: "kube-api-access-fg5bn") pod "12e0467d-86c8-4ec7-af58-a2e9f3c4dd79" (UID: "12e0467d-86c8-4ec7-af58-a2e9f3c4dd79"). InnerVolumeSpecName "kube-api-access-fg5bn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.528810 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg5bn\" (UniqueName: \"kubernetes.io/projected/12e0467d-86c8-4ec7-af58-a2e9f3c4dd79-kube-api-access-fg5bn\") on node \"crc\" DevicePath \"\""
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.803828 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557796-v46qf" event={"ID":"12e0467d-86c8-4ec7-af58-a2e9f3c4dd79","Type":"ContainerDied","Data":"f0b6781a480b97dfac53e47706ac61444d37d6a619188142239076cf161dcaaf"}
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.804267 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0b6781a480b97dfac53e47706ac61444d37d6a619188142239076cf161dcaaf"
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.804366 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557796-v46qf"
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.810280 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerStarted","Data":"8cc1027dd9214f8fb90bd7bfc366e201cdc93cfceadd0588fd52f884f9dba7d2"}
Mar 14 05:56:04 crc kubenswrapper[4713]: I0314 05:56:04.810342 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerStarted","Data":"39547d820722c0c30217aa7ae9c2a722a14049e6069bc445cfe034931a6125ce"}
Mar 14 05:56:05 crc kubenswrapper[4713]: I0314 05:56:05.330818 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557790-mlscf"]
Mar 14 05:56:05 crc kubenswrapper[4713]: I0314 05:56:05.346840 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557790-mlscf"]
Mar 14 05:56:05 crc kubenswrapper[4713]: I0314 05:56:05.580962 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3b651c-ff92-4369-9bec-522a5c7c9aba" path="/var/lib/kubelet/pods/ac3b651c-ff92-4369-9bec-522a5c7c9aba/volumes"
Mar 14 05:56:05 crc kubenswrapper[4713]: I0314 05:56:05.826707 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerStarted","Data":"64e7c847a0c6a09878a30dc88abdff28469857d22754a37ef3fd692f63cda61d"}
Mar 14 05:56:06 crc kubenswrapper[4713]: I0314 05:56:06.565735 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429"
Mar 14 05:56:06 crc kubenswrapper[4713]: E0314 05:56:06.566811 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 05:56:06 crc kubenswrapper[4713]: I0314 05:56:06.839719 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerStarted","Data":"b01489335a9f4953f3cf8fce2d42cb7094033bbe51ce9acf26a53f8be2847286"}
Mar 14 05:56:07 crc kubenswrapper[4713]: I0314 05:56:07.869399 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerStarted","Data":"e9c7f7748f525e49b7b2ee98aa1f7bd3ec75aab6eedc2b06c418287d3545e11c"}
Mar 14 05:56:07 crc kubenswrapper[4713]: I0314 05:56:07.902412 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.404364857 podStartE2EDuration="4.902379668s" podCreationTimestamp="2026-03-14 05:56:03 +0000 UTC" firstStartedPulling="2026-03-14 05:56:04.075574007 +0000 UTC m=+1747.163483307" lastFinishedPulling="2026-03-14 05:56:07.573588818 +0000 UTC m=+1750.661498118" observedRunningTime="2026-03-14 05:56:07.891469563 +0000 UTC m=+1750.979378873" watchObservedRunningTime="2026-03-14 05:56:07.902379668 +0000 UTC m=+1750.990288968"
Mar 14 05:56:08 crc kubenswrapper[4713]: I0314 05:56:08.630736 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 14 05:56:09 crc kubenswrapper[4713]: I0314 05:56:09.102140 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 14 05:56:09 crc kubenswrapper[4713]: I0314 05:56:09.102469 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 14 05:56:10 crc kubenswrapper[4713]: I0314 05:56:10.826558
4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 05:56:10 crc kubenswrapper[4713]: I0314 05:56:10.826867 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 05:56:11 crc kubenswrapper[4713]: I0314 05:56:11.114122 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 05:56:11 crc kubenswrapper[4713]: I0314 05:56:11.114701 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 05:56:11 crc kubenswrapper[4713]: I0314 05:56:11.122767 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 05:56:11 crc kubenswrapper[4713]: I0314 05:56:11.123508 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 05:56:12 crc kubenswrapper[4713]: I0314 05:56:12.835977 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 05:56:12 crc kubenswrapper[4713]: I0314 05:56:12.836953 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 05:56:12 crc kubenswrapper[4713]: I0314 05:56:12.844689 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 05:56:12 crc kubenswrapper[4713]: I0314 05:56:12.935163 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 05:56:12 crc kubenswrapper[4713]: I0314 05:56:12.983270 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 05:56:12 crc kubenswrapper[4713]: I0314 05:56:12.983573 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f04b986f-f5da-4458-93bc-c093c0f8a24b" containerName="kube-state-metrics" 
containerID="cri-o://d72529fc4cdf24bdd4a9ebde5d7efc3d516026012343812cdcb517c23427bba3" gracePeriod=30 Mar 14 05:56:13 crc kubenswrapper[4713]: I0314 05:56:13.267682 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 14 05:56:13 crc kubenswrapper[4713]: I0314 05:56:13.268343 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="6590feb6-7a54-4dcb-8656-60f55d65a5f2" containerName="mysqld-exporter" containerID="cri-o://4633a409ec3408915579cb9121db32d93190d9d003e26d9e8c2692fcccc85444" gracePeriod=30 Mar 14 05:56:13 crc kubenswrapper[4713]: I0314 05:56:13.949087 4713 generic.go:334] "Generic (PLEG): container finished" podID="f04b986f-f5da-4458-93bc-c093c0f8a24b" containerID="d72529fc4cdf24bdd4a9ebde5d7efc3d516026012343812cdcb517c23427bba3" exitCode=2 Mar 14 05:56:13 crc kubenswrapper[4713]: I0314 05:56:13.949400 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f04b986f-f5da-4458-93bc-c093c0f8a24b","Type":"ContainerDied","Data":"d72529fc4cdf24bdd4a9ebde5d7efc3d516026012343812cdcb517c23427bba3"} Mar 14 05:56:13 crc kubenswrapper[4713]: I0314 05:56:13.949430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f04b986f-f5da-4458-93bc-c093c0f8a24b","Type":"ContainerDied","Data":"7ce09c6b1b05b653eae10831cf0ce24371bbfd43eb71f7cf7121c37c7aee8490"} Mar 14 05:56:13 crc kubenswrapper[4713]: I0314 05:56:13.949442 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ce09c6b1b05b653eae10831cf0ce24371bbfd43eb71f7cf7121c37c7aee8490" Mar 14 05:56:13 crc kubenswrapper[4713]: I0314 05:56:13.958862 4713 generic.go:334] "Generic (PLEG): container finished" podID="6590feb6-7a54-4dcb-8656-60f55d65a5f2" containerID="4633a409ec3408915579cb9121db32d93190d9d003e26d9e8c2692fcccc85444" exitCode=2 Mar 14 05:56:13 crc kubenswrapper[4713]: I0314 
05:56:13.959837 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"6590feb6-7a54-4dcb-8656-60f55d65a5f2","Type":"ContainerDied","Data":"4633a409ec3408915579cb9121db32d93190d9d003e26d9e8c2692fcccc85444"} Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.128874 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.141841 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.223142 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwgkg\" (UniqueName: \"kubernetes.io/projected/f04b986f-f5da-4458-93bc-c093c0f8a24b-kube-api-access-xwgkg\") pod \"f04b986f-f5da-4458-93bc-c093c0f8a24b\" (UID: \"f04b986f-f5da-4458-93bc-c093c0f8a24b\") " Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.223302 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-config-data\") pod \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.223574 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-combined-ca-bundle\") pod \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.223661 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klgtp\" (UniqueName: \"kubernetes.io/projected/6590feb6-7a54-4dcb-8656-60f55d65a5f2-kube-api-access-klgtp\") pod 
\"6590feb6-7a54-4dcb-8656-60f55d65a5f2\" (UID: \"6590feb6-7a54-4dcb-8656-60f55d65a5f2\") " Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.232504 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6590feb6-7a54-4dcb-8656-60f55d65a5f2-kube-api-access-klgtp" (OuterVolumeSpecName: "kube-api-access-klgtp") pod "6590feb6-7a54-4dcb-8656-60f55d65a5f2" (UID: "6590feb6-7a54-4dcb-8656-60f55d65a5f2"). InnerVolumeSpecName "kube-api-access-klgtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.232954 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04b986f-f5da-4458-93bc-c093c0f8a24b-kube-api-access-xwgkg" (OuterVolumeSpecName: "kube-api-access-xwgkg") pod "f04b986f-f5da-4458-93bc-c093c0f8a24b" (UID: "f04b986f-f5da-4458-93bc-c093c0f8a24b"). InnerVolumeSpecName "kube-api-access-xwgkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.263734 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6590feb6-7a54-4dcb-8656-60f55d65a5f2" (UID: "6590feb6-7a54-4dcb-8656-60f55d65a5f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.298897 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-config-data" (OuterVolumeSpecName: "config-data") pod "6590feb6-7a54-4dcb-8656-60f55d65a5f2" (UID: "6590feb6-7a54-4dcb-8656-60f55d65a5f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.329435 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.329488 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klgtp\" (UniqueName: \"kubernetes.io/projected/6590feb6-7a54-4dcb-8656-60f55d65a5f2-kube-api-access-klgtp\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.329501 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwgkg\" (UniqueName: \"kubernetes.io/projected/f04b986f-f5da-4458-93bc-c093c0f8a24b-kube-api-access-xwgkg\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.329510 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6590feb6-7a54-4dcb-8656-60f55d65a5f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.972161 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"6590feb6-7a54-4dcb-8656-60f55d65a5f2","Type":"ContainerDied","Data":"07d1c04095dbdf8fa8c6658f2e1460ce9d1967c1168bc8c1a6eb0c021501c658"} Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.972250 4713 scope.go:117] "RemoveContainer" containerID="4633a409ec3408915579cb9121db32d93190d9d003e26d9e8c2692fcccc85444" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.972443 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 05:56:14 crc kubenswrapper[4713]: I0314 05:56:14.972598 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.018606 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.041122 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.053110 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.072397 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.078574 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: E0314 05:56:15.079108 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04b986f-f5da-4458-93bc-c093c0f8a24b" containerName="kube-state-metrics" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.079126 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04b986f-f5da-4458-93bc-c093c0f8a24b" containerName="kube-state-metrics" Mar 14 05:56:15 crc kubenswrapper[4713]: E0314 05:56:15.079153 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6590feb6-7a54-4dcb-8656-60f55d65a5f2" containerName="mysqld-exporter" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.079160 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6590feb6-7a54-4dcb-8656-60f55d65a5f2" containerName="mysqld-exporter" Mar 14 05:56:15 crc kubenswrapper[4713]: E0314 05:56:15.079182 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e0467d-86c8-4ec7-af58-a2e9f3c4dd79" containerName="oc" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.079188 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e0467d-86c8-4ec7-af58-a2e9f3c4dd79" 
containerName="oc" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.079402 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6590feb6-7a54-4dcb-8656-60f55d65a5f2" containerName="mysqld-exporter" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.079438 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04b986f-f5da-4458-93bc-c093c0f8a24b" containerName="kube-state-metrics" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.079454 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e0467d-86c8-4ec7-af58-a2e9f3c4dd79" containerName="oc" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.080316 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.082562 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.084186 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.089326 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.113144 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.115150 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.115284 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.117410 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.117637 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.145297 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkf2\" (UniqueName: \"kubernetes.io/projected/e0064af5-2496-4bcb-89b2-f9446d023d2b-kube-api-access-skkf2\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.145464 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0064af5-2496-4bcb-89b2-f9446d023d2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.145663 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e0064af5-2496-4bcb-89b2-f9446d023d2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.146154 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0064af5-2496-4bcb-89b2-f9446d023d2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") 
" pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.248593 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0064af5-2496-4bcb-89b2-f9446d023d2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.248689 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e0064af5-2496-4bcb-89b2-f9446d023d2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.248755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psq9f\" (UniqueName: \"kubernetes.io/projected/80149b83-1a15-44d7-be14-8bd3ae881e7e-kube-api-access-psq9f\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.248779 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80149b83-1a15-44d7-be14-8bd3ae881e7e-config-data\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.248948 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/80149b83-1a15-44d7-be14-8bd3ae881e7e-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" 
Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.249084 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0064af5-2496-4bcb-89b2-f9446d023d2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.249166 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skkf2\" (UniqueName: \"kubernetes.io/projected/e0064af5-2496-4bcb-89b2-f9446d023d2b-kube-api-access-skkf2\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.249191 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80149b83-1a15-44d7-be14-8bd3ae881e7e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.255621 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0064af5-2496-4bcb-89b2-f9446d023d2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.256052 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e0064af5-2496-4bcb-89b2-f9446d023d2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: 
I0314 05:56:15.258470 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0064af5-2496-4bcb-89b2-f9446d023d2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.271043 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkf2\" (UniqueName: \"kubernetes.io/projected/e0064af5-2496-4bcb-89b2-f9446d023d2b-kube-api-access-skkf2\") pod \"kube-state-metrics-0\" (UID: \"e0064af5-2496-4bcb-89b2-f9446d023d2b\") " pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.330289 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.330633 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="ceilometer-central-agent" containerID="cri-o://ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97" gracePeriod=30 Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.330663 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="proxy-httpd" containerID="cri-o://46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1" gracePeriod=30 Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.330667 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="sg-core" containerID="cri-o://dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925" gracePeriod=30 Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.330733 4713 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="ceilometer-notification-agent" containerID="cri-o://438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c" gracePeriod=30 Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.352676 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80149b83-1a15-44d7-be14-8bd3ae881e7e-config-data\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.352771 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/80149b83-1a15-44d7-be14-8bd3ae881e7e-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.352867 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80149b83-1a15-44d7-be14-8bd3ae881e7e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.352993 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psq9f\" (UniqueName: \"kubernetes.io/projected/80149b83-1a15-44d7-be14-8bd3ae881e7e-kube-api-access-psq9f\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.356723 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80149b83-1a15-44d7-be14-8bd3ae881e7e-config-data\") 
pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.357866 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80149b83-1a15-44d7-be14-8bd3ae881e7e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.357894 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/80149b83-1a15-44d7-be14-8bd3ae881e7e-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.381161 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psq9f\" (UniqueName: \"kubernetes.io/projected/80149b83-1a15-44d7-be14-8bd3ae881e7e-kube-api-access-psq9f\") pod \"mysqld-exporter-0\" (UID: \"80149b83-1a15-44d7-be14-8bd3ae881e7e\") " pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.399580 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.434121 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.597756 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6590feb6-7a54-4dcb-8656-60f55d65a5f2" path="/var/lib/kubelet/pods/6590feb6-7a54-4dcb-8656-60f55d65a5f2/volumes" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.600137 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04b986f-f5da-4458-93bc-c093c0f8a24b" path="/var/lib/kubelet/pods/f04b986f-f5da-4458-93bc-c093c0f8a24b/volumes" Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.917722 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 05:56:15 crc kubenswrapper[4713]: W0314 05:56:15.920092 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0064af5_2496_4bcb_89b2_f9446d023d2b.slice/crio-65c6ffddfbd19223fddb47779ccae27814ee0950d4440155ff0d5da8ecc41c21 WatchSource:0}: Error finding container 65c6ffddfbd19223fddb47779ccae27814ee0950d4440155ff0d5da8ecc41c21: Status 404 returned error can't find the container with id 65c6ffddfbd19223fddb47779ccae27814ee0950d4440155ff0d5da8ecc41c21 Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.996530 4713 generic.go:334] "Generic (PLEG): container finished" podID="0cb61232-d986-4a6e-a52d-26ff98998087" containerID="46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1" exitCode=0 Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.996591 4713 generic.go:334] "Generic (PLEG): container finished" podID="0cb61232-d986-4a6e-a52d-26ff98998087" containerID="dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925" exitCode=2 Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.996607 4713 generic.go:334] "Generic (PLEG): container finished" podID="0cb61232-d986-4a6e-a52d-26ff98998087" 
containerID="ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97" exitCode=0 Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.996686 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerDied","Data":"46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1"} Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.996721 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerDied","Data":"dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925"} Mar 14 05:56:15 crc kubenswrapper[4713]: I0314 05:56:15.996741 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerDied","Data":"ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97"} Mar 14 05:56:16 crc kubenswrapper[4713]: I0314 05:56:16.001632 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e0064af5-2496-4bcb-89b2-f9446d023d2b","Type":"ContainerStarted","Data":"65c6ffddfbd19223fddb47779ccae27814ee0950d4440155ff0d5da8ecc41c21"} Mar 14 05:56:16 crc kubenswrapper[4713]: W0314 05:56:16.063478 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80149b83_1a15_44d7_be14_8bd3ae881e7e.slice/crio-d344a538a5e39fda8cbf983898650ad6ff92ecadd61f3992dbb20d578f80069e WatchSource:0}: Error finding container d344a538a5e39fda8cbf983898650ad6ff92ecadd61f3992dbb20d578f80069e: Status 404 returned error can't find the container with id d344a538a5e39fda8cbf983898650ad6ff92ecadd61f3992dbb20d578f80069e Mar 14 05:56:16 crc kubenswrapper[4713]: I0314 05:56:16.067047 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 14 05:56:17 crc 
kubenswrapper[4713]: I0314 05:56:17.014145 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"80149b83-1a15-44d7-be14-8bd3ae881e7e","Type":"ContainerStarted","Data":"b89e026ffb84c4893b770f733b4405432e0e4cc2ab4a671fd390c2753d18e1fa"} Mar 14 05:56:17 crc kubenswrapper[4713]: I0314 05:56:17.014779 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"80149b83-1a15-44d7-be14-8bd3ae881e7e","Type":"ContainerStarted","Data":"d344a538a5e39fda8cbf983898650ad6ff92ecadd61f3992dbb20d578f80069e"} Mar 14 05:56:17 crc kubenswrapper[4713]: I0314 05:56:17.015940 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e0064af5-2496-4bcb-89b2-f9446d023d2b","Type":"ContainerStarted","Data":"87e25606ac8329be4553e71f2752675f18f992730408804afc5546c74da3b354"} Mar 14 05:56:17 crc kubenswrapper[4713]: I0314 05:56:17.016121 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 14 05:56:17 crc kubenswrapper[4713]: I0314 05:56:17.043869 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=1.512830525 podStartE2EDuration="2.043841767s" podCreationTimestamp="2026-03-14 05:56:15 +0000 UTC" firstStartedPulling="2026-03-14 05:56:16.065931804 +0000 UTC m=+1759.153841104" lastFinishedPulling="2026-03-14 05:56:16.596943046 +0000 UTC m=+1759.684852346" observedRunningTime="2026-03-14 05:56:17.028491786 +0000 UTC m=+1760.116401096" watchObservedRunningTime="2026-03-14 05:56:17.043841767 +0000 UTC m=+1760.131751067" Mar 14 05:56:17 crc kubenswrapper[4713]: I0314 05:56:17.062670 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.679427053 podStartE2EDuration="2.062646134s" podCreationTimestamp="2026-03-14 05:56:15 +0000 UTC" 
firstStartedPulling="2026-03-14 05:56:15.922795705 +0000 UTC m=+1759.010705005" lastFinishedPulling="2026-03-14 05:56:16.306014786 +0000 UTC m=+1759.393924086" observedRunningTime="2026-03-14 05:56:17.055385311 +0000 UTC m=+1760.143294611" watchObservedRunningTime="2026-03-14 05:56:17.062646134 +0000 UTC m=+1760.150555434" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.564361 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:56:18 crc kubenswrapper[4713]: E0314 05:56:18.565265 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.622592 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.661208 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-log-httpd\") pod \"0cb61232-d986-4a6e-a52d-26ff98998087\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.661459 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wgtx\" (UniqueName: \"kubernetes.io/projected/0cb61232-d986-4a6e-a52d-26ff98998087-kube-api-access-8wgtx\") pod \"0cb61232-d986-4a6e-a52d-26ff98998087\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.661572 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-combined-ca-bundle\") pod \"0cb61232-d986-4a6e-a52d-26ff98998087\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.661683 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-run-httpd\") pod \"0cb61232-d986-4a6e-a52d-26ff98998087\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.661721 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-scripts\") pod \"0cb61232-d986-4a6e-a52d-26ff98998087\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.661763 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-config-data\") pod \"0cb61232-d986-4a6e-a52d-26ff98998087\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.661844 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-sg-core-conf-yaml\") pod \"0cb61232-d986-4a6e-a52d-26ff98998087\" (UID: \"0cb61232-d986-4a6e-a52d-26ff98998087\") " Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.664327 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0cb61232-d986-4a6e-a52d-26ff98998087" (UID: "0cb61232-d986-4a6e-a52d-26ff98998087"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.672537 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0cb61232-d986-4a6e-a52d-26ff98998087" (UID: "0cb61232-d986-4a6e-a52d-26ff98998087"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.673386 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb61232-d986-4a6e-a52d-26ff98998087-kube-api-access-8wgtx" (OuterVolumeSpecName: "kube-api-access-8wgtx") pod "0cb61232-d986-4a6e-a52d-26ff98998087" (UID: "0cb61232-d986-4a6e-a52d-26ff98998087"). InnerVolumeSpecName "kube-api-access-8wgtx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.683879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-scripts" (OuterVolumeSpecName: "scripts") pod "0cb61232-d986-4a6e-a52d-26ff98998087" (UID: "0cb61232-d986-4a6e-a52d-26ff98998087"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.719894 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0cb61232-d986-4a6e-a52d-26ff98998087" (UID: "0cb61232-d986-4a6e-a52d-26ff98998087"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.765614 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wgtx\" (UniqueName: \"kubernetes.io/projected/0cb61232-d986-4a6e-a52d-26ff98998087-kube-api-access-8wgtx\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.765658 4713 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.765710 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.765722 4713 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:18 crc 
kubenswrapper[4713]: I0314 05:56:18.765734 4713 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cb61232-d986-4a6e-a52d-26ff98998087-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.794668 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cb61232-d986-4a6e-a52d-26ff98998087" (UID: "0cb61232-d986-4a6e-a52d-26ff98998087"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.810124 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-config-data" (OuterVolumeSpecName: "config-data") pod "0cb61232-d986-4a6e-a52d-26ff98998087" (UID: "0cb61232-d986-4a6e-a52d-26ff98998087"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.882578 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:18 crc kubenswrapper[4713]: I0314 05:56:18.882907 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb61232-d986-4a6e-a52d-26ff98998087-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.040043 4713 generic.go:334] "Generic (PLEG): container finished" podID="0cb61232-d986-4a6e-a52d-26ff98998087" containerID="438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c" exitCode=0 Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.040091 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerDied","Data":"438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c"} Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.040119 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cb61232-d986-4a6e-a52d-26ff98998087","Type":"ContainerDied","Data":"611f585cb2d626fd143170c10a557cd3eca85f1d91ffc9fd9a290dcc1258a551"} Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.040135 4713 scope.go:117] "RemoveContainer" containerID="46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.040316 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.068738 4713 scope.go:117] "RemoveContainer" containerID="dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.081963 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.098721 4713 scope.go:117] "RemoveContainer" containerID="438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.113145 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.127723 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:56:19 crc kubenswrapper[4713]: E0314 05:56:19.128287 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="ceilometer-central-agent" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.128303 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="ceilometer-central-agent" Mar 14 05:56:19 crc kubenswrapper[4713]: E0314 05:56:19.128331 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="proxy-httpd" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.128338 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="proxy-httpd" Mar 14 05:56:19 crc kubenswrapper[4713]: E0314 05:56:19.128358 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="ceilometer-notification-agent" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.128364 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="ceilometer-notification-agent" Mar 14 05:56:19 crc kubenswrapper[4713]: E0314 05:56:19.128376 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="sg-core" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.128381 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="sg-core" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.128623 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="ceilometer-notification-agent" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.128636 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="proxy-httpd" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.128650 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="sg-core" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.128658 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" containerName="ceilometer-central-agent" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.131550 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.134945 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.135905 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.136092 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.139111 4713 scope.go:117] "RemoveContainer" containerID="ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.144192 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.188789 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-run-httpd\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.188879 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.188906 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-log-httpd\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " 
pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.188922 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.189006 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-scripts\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.189065 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8x42\" (UniqueName: \"kubernetes.io/projected/7a4a031b-b3df-4ee3-beca-1de9f529f69a-kube-api-access-r8x42\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.189104 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.189134 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-config-data\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.210683 4713 scope.go:117] 
"RemoveContainer" containerID="46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1" Mar 14 05:56:19 crc kubenswrapper[4713]: E0314 05:56:19.217387 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1\": container with ID starting with 46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1 not found: ID does not exist" containerID="46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.217444 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1"} err="failed to get container status \"46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1\": rpc error: code = NotFound desc = could not find container \"46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1\": container with ID starting with 46f57bef2fa48a0a6ff981292473bdcafb4b82ce089f8966ba2fab51458826a1 not found: ID does not exist" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.217480 4713 scope.go:117] "RemoveContainer" containerID="dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925" Mar 14 05:56:19 crc kubenswrapper[4713]: E0314 05:56:19.217799 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925\": container with ID starting with dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925 not found: ID does not exist" containerID="dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.217826 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925"} err="failed to get container status \"dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925\": rpc error: code = NotFound desc = could not find container \"dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925\": container with ID starting with dff2dfb52221442e2dd244116d84a9d7fb71005c288939330efddb13337c3925 not found: ID does not exist" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.217846 4713 scope.go:117] "RemoveContainer" containerID="438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c" Mar 14 05:56:19 crc kubenswrapper[4713]: E0314 05:56:19.218129 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c\": container with ID starting with 438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c not found: ID does not exist" containerID="438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.218156 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c"} err="failed to get container status \"438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c\": rpc error: code = NotFound desc = could not find container \"438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c\": container with ID starting with 438e25e0b3ee31f3597397357ca386086f777502f83aacf1da33d8e07d9b342c not found: ID does not exist" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.218172 4713 scope.go:117] "RemoveContainer" containerID="ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97" Mar 14 05:56:19 crc kubenswrapper[4713]: E0314 05:56:19.218466 4713 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97\": container with ID starting with ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97 not found: ID does not exist" containerID="ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.218491 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97"} err="failed to get container status \"ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97\": rpc error: code = NotFound desc = could not find container \"ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97\": container with ID starting with ee53bde6fcf55ce821fa5e5a0ee6db6a19d56c2235728a246c2f1364543a8a97 not found: ID does not exist" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.290625 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8x42\" (UniqueName: \"kubernetes.io/projected/7a4a031b-b3df-4ee3-beca-1de9f529f69a-kube-api-access-r8x42\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.290973 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.291031 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-config-data\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " 
pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.291073 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-run-httpd\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.291125 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.291146 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-log-httpd\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.291162 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.291262 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-scripts\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.292140 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-run-httpd\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.292543 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-log-httpd\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.295296 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-scripts\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.295472 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.296845 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.296938 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.301699 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-config-data\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.315370 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8x42\" (UniqueName: \"kubernetes.io/projected/7a4a031b-b3df-4ee3-beca-1de9f529f69a-kube-api-access-r8x42\") pod \"ceilometer-0\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.481382 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.583119 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb61232-d986-4a6e-a52d-26ff98998087" path="/var/lib/kubelet/pods/0cb61232-d986-4a6e-a52d-26ff98998087/volumes" Mar 14 05:56:19 crc kubenswrapper[4713]: I0314 05:56:19.968908 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:56:19 crc kubenswrapper[4713]: W0314 05:56:19.970923 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a4a031b_b3df_4ee3_beca_1de9f529f69a.slice/crio-a9231ecc3c0a1c8610344075f47c6ac8f8598fab106b4dacfc9e36709851aa7a WatchSource:0}: Error finding container a9231ecc3c0a1c8610344075f47c6ac8f8598fab106b4dacfc9e36709851aa7a: Status 404 returned error can't find the container with id a9231ecc3c0a1c8610344075f47c6ac8f8598fab106b4dacfc9e36709851aa7a Mar 14 05:56:20 crc kubenswrapper[4713]: I0314 05:56:20.050657 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerStarted","Data":"a9231ecc3c0a1c8610344075f47c6ac8f8598fab106b4dacfc9e36709851aa7a"} Mar 14 05:56:21 crc kubenswrapper[4713]: I0314 05:56:21.066408 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerStarted","Data":"790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1"} Mar 14 05:56:22 crc kubenswrapper[4713]: I0314 05:56:22.082577 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerStarted","Data":"40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b"} Mar 14 05:56:23 crc kubenswrapper[4713]: I0314 05:56:23.098137 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerStarted","Data":"50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf"} Mar 14 05:56:24 crc kubenswrapper[4713]: I0314 05:56:24.129178 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerStarted","Data":"7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5"} Mar 14 05:56:24 crc kubenswrapper[4713]: I0314 05:56:24.129579 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 05:56:24 crc kubenswrapper[4713]: I0314 05:56:24.157410 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.499890071 podStartE2EDuration="5.15738686s" podCreationTimestamp="2026-03-14 05:56:19 +0000 UTC" firstStartedPulling="2026-03-14 05:56:19.973365028 +0000 UTC m=+1763.061274328" lastFinishedPulling="2026-03-14 05:56:23.630861817 +0000 UTC m=+1766.718771117" observedRunningTime="2026-03-14 
05:56:24.155577155 +0000 UTC m=+1767.243486475" watchObservedRunningTime="2026-03-14 05:56:24.15738686 +0000 UTC m=+1767.245296160" Mar 14 05:56:25 crc kubenswrapper[4713]: I0314 05:56:25.420629 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 14 05:56:30 crc kubenswrapper[4713]: I0314 05:56:30.578232 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:56:30 crc kubenswrapper[4713]: E0314 05:56:30.579648 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:56:42 crc kubenswrapper[4713]: I0314 05:56:42.563525 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:56:42 crc kubenswrapper[4713]: E0314 05:56:42.564695 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:56:45 crc kubenswrapper[4713]: I0314 05:56:45.452156 4713 scope.go:117] "RemoveContainer" containerID="d72529fc4cdf24bdd4a9ebde5d7efc3d516026012343812cdcb517c23427bba3" Mar 14 05:56:45 crc kubenswrapper[4713]: I0314 05:56:45.499514 4713 scope.go:117] "RemoveContainer" 
containerID="29c3fe2cc97c8cd4221c05d4eece2a38d0a377297076d6d859c628e0f71b4bec" Mar 14 05:56:49 crc kubenswrapper[4713]: I0314 05:56:49.494060 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 05:56:55 crc kubenswrapper[4713]: I0314 05:56:55.565277 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:56:55 crc kubenswrapper[4713]: E0314 05:56:55.566358 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.580834 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-xrnbs"] Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.594842 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-xrnbs"] Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.686953 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-cv4bn"] Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.688824 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.721692 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cv4bn"] Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.820667 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-combined-ca-bundle\") pod \"heat-db-sync-cv4bn\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.820854 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg2rl\" (UniqueName: \"kubernetes.io/projected/0230bf57-ad22-415a-bc91-2269773f8097-kube-api-access-mg2rl\") pod \"heat-db-sync-cv4bn\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.821343 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-config-data\") pod \"heat-db-sync-cv4bn\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.924140 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg2rl\" (UniqueName: \"kubernetes.io/projected/0230bf57-ad22-415a-bc91-2269773f8097-kube-api-access-mg2rl\") pod \"heat-db-sync-cv4bn\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.924311 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-config-data\") pod \"heat-db-sync-cv4bn\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.924363 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-combined-ca-bundle\") pod \"heat-db-sync-cv4bn\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.931357 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-combined-ca-bundle\") pod \"heat-db-sync-cv4bn\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.935226 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-config-data\") pod \"heat-db-sync-cv4bn\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:00 crc kubenswrapper[4713]: I0314 05:57:00.946989 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg2rl\" (UniqueName: \"kubernetes.io/projected/0230bf57-ad22-415a-bc91-2269773f8097-kube-api-access-mg2rl\") pod \"heat-db-sync-cv4bn\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:01 crc kubenswrapper[4713]: I0314 05:57:01.011416 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:01 crc kubenswrapper[4713]: I0314 05:57:01.500984 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cv4bn"] Mar 14 05:57:01 crc kubenswrapper[4713]: I0314 05:57:01.512884 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 05:57:01 crc kubenswrapper[4713]: I0314 05:57:01.556271 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cv4bn" event={"ID":"0230bf57-ad22-415a-bc91-2269773f8097","Type":"ContainerStarted","Data":"806ab68e54b900241edc1e4665286a508c665c4f7b5ba652e7c5a989bf3f9186"} Mar 14 05:57:01 crc kubenswrapper[4713]: I0314 05:57:01.580673 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9b887e-a476-4d85-8fc0-695678cee457" path="/var/lib/kubelet/pods/3f9b887e-a476-4d85-8fc0-695678cee457/volumes" Mar 14 05:57:02 crc kubenswrapper[4713]: I0314 05:57:02.713871 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.275828 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.276164 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="ceilometer-central-agent" containerID="cri-o://790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1" gracePeriod=30 Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.276332 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="proxy-httpd" containerID="cri-o://7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5" gracePeriod=30 Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.276381 
4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="sg-core" containerID="cri-o://50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf" gracePeriod=30 Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.276417 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="ceilometer-notification-agent" containerID="cri-o://40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b" gracePeriod=30 Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.636560 4713 generic.go:334] "Generic (PLEG): container finished" podID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerID="7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5" exitCode=0 Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.636600 4713 generic.go:334] "Generic (PLEG): container finished" podID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerID="50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf" exitCode=2 Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.636623 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerDied","Data":"7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5"} Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.636649 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerDied","Data":"50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf"} Mar 14 05:57:03 crc kubenswrapper[4713]: I0314 05:57:03.883922 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 05:57:04 crc kubenswrapper[4713]: I0314 05:57:04.680008 4713 generic.go:334] "Generic 
(PLEG): container finished" podID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerID="790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1" exitCode=0 Mar 14 05:57:04 crc kubenswrapper[4713]: I0314 05:57:04.680367 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerDied","Data":"790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1"} Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.498527 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.648130 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-config-data\") pod \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.648576 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-sg-core-conf-yaml\") pod \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.648698 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-run-httpd\") pod \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.648820 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-ceilometer-tls-certs\") pod \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\" (UID: 
\"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.648956 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-combined-ca-bundle\") pod \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.648988 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-scripts\") pod \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.649007 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8x42\" (UniqueName: \"kubernetes.io/projected/7a4a031b-b3df-4ee3-beca-1de9f529f69a-kube-api-access-r8x42\") pod \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.649048 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-log-httpd\") pod \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\" (UID: \"7a4a031b-b3df-4ee3-beca-1de9f529f69a\") " Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.650236 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7a4a031b-b3df-4ee3-beca-1de9f529f69a" (UID: "7a4a031b-b3df-4ee3-beca-1de9f529f69a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.658851 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7a4a031b-b3df-4ee3-beca-1de9f529f69a" (UID: "7a4a031b-b3df-4ee3-beca-1de9f529f69a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.672085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-scripts" (OuterVolumeSpecName: "scripts") pod "7a4a031b-b3df-4ee3-beca-1de9f529f69a" (UID: "7a4a031b-b3df-4ee3-beca-1de9f529f69a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.682503 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a4a031b-b3df-4ee3-beca-1de9f529f69a-kube-api-access-r8x42" (OuterVolumeSpecName: "kube-api-access-r8x42") pod "7a4a031b-b3df-4ee3-beca-1de9f529f69a" (UID: "7a4a031b-b3df-4ee3-beca-1de9f529f69a"). InnerVolumeSpecName "kube-api-access-r8x42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.744246 4713 generic.go:334] "Generic (PLEG): container finished" podID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerID="40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b" exitCode=0 Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.744472 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerDied","Data":"40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b"} Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.745016 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a4a031b-b3df-4ee3-beca-1de9f529f69a","Type":"ContainerDied","Data":"a9231ecc3c0a1c8610344075f47c6ac8f8598fab106b4dacfc9e36709851aa7a"} Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.744746 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.745066 4713 scope.go:117] "RemoveContainer" containerID="7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.751944 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.752652 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8x42\" (UniqueName: \"kubernetes.io/projected/7a4a031b-b3df-4ee3-beca-1de9f529f69a-kube-api-access-r8x42\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.752693 4713 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.752705 4713 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a4a031b-b3df-4ee3-beca-1de9f529f69a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.797227 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7a4a031b-b3df-4ee3-beca-1de9f529f69a" (UID: "7a4a031b-b3df-4ee3-beca-1de9f529f69a"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.831235 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7a4a031b-b3df-4ee3-beca-1de9f529f69a" (UID: "7a4a031b-b3df-4ee3-beca-1de9f529f69a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.845836 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a4a031b-b3df-4ee3-beca-1de9f529f69a" (UID: "7a4a031b-b3df-4ee3-beca-1de9f529f69a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.855423 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.855453 4713 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.855462 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:05 crc kubenswrapper[4713]: I0314 05:57:05.962679 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-config-data" 
(OuterVolumeSpecName: "config-data") pod "7a4a031b-b3df-4ee3-beca-1de9f529f69a" (UID: "7a4a031b-b3df-4ee3-beca-1de9f529f69a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.061381 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4a031b-b3df-4ee3-beca-1de9f529f69a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.091934 4713 scope.go:117] "RemoveContainer" containerID="50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.121817 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.143239 4713 scope.go:117] "RemoveContainer" containerID="40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.146758 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.214331 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:57:06 crc kubenswrapper[4713]: E0314 05:57:06.215972 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="ceilometer-central-agent" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.216100 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="ceilometer-central-agent" Mar 14 05:57:06 crc kubenswrapper[4713]: E0314 05:57:06.216229 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="sg-core" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.216313 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="sg-core" Mar 14 05:57:06 crc kubenswrapper[4713]: E0314 05:57:06.216422 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="ceilometer-notification-agent" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.216508 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="ceilometer-notification-agent" Mar 14 05:57:06 crc kubenswrapper[4713]: E0314 05:57:06.216653 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="proxy-httpd" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.216742 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="proxy-httpd" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.217482 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="sg-core" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.217651 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="proxy-httpd" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.217760 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="ceilometer-central-agent" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.217938 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" containerName="ceilometer-notification-agent" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.224502 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.231766 4713 scope.go:117] "RemoveContainer" containerID="790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.232125 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.232179 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.232225 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.235689 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.286530 4713 scope.go:117] "RemoveContainer" containerID="7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5" Mar 14 05:57:06 crc kubenswrapper[4713]: E0314 05:57:06.287946 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5\": container with ID starting with 7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5 not found: ID does not exist" containerID="7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.287992 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5"} err="failed to get container status \"7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5\": rpc error: code = NotFound desc = could not find container \"7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5\": 
container with ID starting with 7929d2ad0cba18ef742069cb93efd03024003e22186bc31592828a344cb5abd5 not found: ID does not exist" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.288024 4713 scope.go:117] "RemoveContainer" containerID="50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf" Mar 14 05:57:06 crc kubenswrapper[4713]: E0314 05:57:06.288653 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf\": container with ID starting with 50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf not found: ID does not exist" containerID="50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.288714 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf"} err="failed to get container status \"50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf\": rpc error: code = NotFound desc = could not find container \"50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf\": container with ID starting with 50ed71e3fed91d7e52a70bd69eeac5d449878a46fe2c6e647ecdb392ededc4bf not found: ID does not exist" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.288753 4713 scope.go:117] "RemoveContainer" containerID="40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b" Mar 14 05:57:06 crc kubenswrapper[4713]: E0314 05:57:06.292021 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b\": container with ID starting with 40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b not found: ID does not exist" 
containerID="40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.292156 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b"} err="failed to get container status \"40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b\": rpc error: code = NotFound desc = could not find container \"40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b\": container with ID starting with 40188088275eee077fc8433b46603847597119ebe1a8f1f8928b51c4309e2d1b not found: ID does not exist" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.292299 4713 scope.go:117] "RemoveContainer" containerID="790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1" Mar 14 05:57:06 crc kubenswrapper[4713]: E0314 05:57:06.296560 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1\": container with ID starting with 790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1 not found: ID does not exist" containerID="790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.296738 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1"} err="failed to get container status \"790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1\": rpc error: code = NotFound desc = could not find container \"790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1\": container with ID starting with 790ab204a723c820584bd626cb4934f4610473567f18540232a3839a375902a1 not found: ID does not exist" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.386624 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c79b18-2189-46d9-bbd4-55f58870d723-run-httpd\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.386900 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvfph\" (UniqueName: \"kubernetes.io/projected/c2c79b18-2189-46d9-bbd4-55f58870d723-kube-api-access-mvfph\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.386983 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c79b18-2189-46d9-bbd4-55f58870d723-log-httpd\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.387026 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.387283 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-config-data\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.387434 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-scripts\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.387632 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.387680 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.490435 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.490518 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.490614 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c79b18-2189-46d9-bbd4-55f58870d723-run-httpd\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " 
pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.490656 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvfph\" (UniqueName: \"kubernetes.io/projected/c2c79b18-2189-46d9-bbd4-55f58870d723-kube-api-access-mvfph\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.490691 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c79b18-2189-46d9-bbd4-55f58870d723-log-httpd\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.490720 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.490784 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-config-data\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.490844 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-scripts\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.492005 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c2c79b18-2189-46d9-bbd4-55f58870d723-run-httpd\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.492493 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2c79b18-2189-46d9-bbd4-55f58870d723-log-httpd\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.495831 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-scripts\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.496682 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-config-data\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.496879 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.497617 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.512542 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c79b18-2189-46d9-bbd4-55f58870d723-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.515860 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvfph\" (UniqueName: \"kubernetes.io/projected/c2c79b18-2189-46d9-bbd4-55f58870d723-kube-api-access-mvfph\") pod \"ceilometer-0\" (UID: \"c2c79b18-2189-46d9-bbd4-55f58870d723\") " pod="openstack/ceilometer-0" Mar 14 05:57:06 crc kubenswrapper[4713]: I0314 05:57:06.565346 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 05:57:07 crc kubenswrapper[4713]: I0314 05:57:07.256263 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 05:57:07 crc kubenswrapper[4713]: I0314 05:57:07.566153 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:57:07 crc kubenswrapper[4713]: E0314 05:57:07.566756 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:57:07 crc kubenswrapper[4713]: I0314 05:57:07.608537 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a4a031b-b3df-4ee3-beca-1de9f529f69a" path="/var/lib/kubelet/pods/7a4a031b-b3df-4ee3-beca-1de9f529f69a/volumes" Mar 14 05:57:07 crc kubenswrapper[4713]: I0314 05:57:07.788585 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c2c79b18-2189-46d9-bbd4-55f58870d723","Type":"ContainerStarted","Data":"c052d5e5fa94aef209c88ec8a154eb9c0ca0b14bc8eac03e1a4b6a92c9506900"} Mar 14 05:57:08 crc kubenswrapper[4713]: I0314 05:57:08.504309 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="145c4018-82f1-49b5-9d3b-15c97c299a4a" containerName="rabbitmq" containerID="cri-o://2d261e1bd6c66bf317e7962278ac94bd87c70e5cdbe5c85550fac6df5c0a208b" gracePeriod=604795 Mar 14 05:57:09 crc kubenswrapper[4713]: I0314 05:57:09.548893 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ca9b4055-d903-491e-bbf8-4777d51a1af8" containerName="rabbitmq" containerID="cri-o://b19fb7a7667d09e3c59d54fcbe249af2fd6a6fdc99fde2d16643240caf47febe" gracePeriod=604795 Mar 14 05:57:15 crc kubenswrapper[4713]: I0314 05:57:15.586980 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ca9b4055-d903-491e-bbf8-4777d51a1af8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 14 05:57:15 crc kubenswrapper[4713]: I0314 05:57:15.727373 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="145c4018-82f1-49b5-9d3b-15c97c299a4a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 14 05:57:15 crc kubenswrapper[4713]: I0314 05:57:15.952658 4713 generic.go:334] "Generic (PLEG): container finished" podID="ca9b4055-d903-491e-bbf8-4777d51a1af8" containerID="b19fb7a7667d09e3c59d54fcbe249af2fd6a6fdc99fde2d16643240caf47febe" exitCode=0 Mar 14 05:57:15 crc kubenswrapper[4713]: I0314 05:57:15.952742 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"ca9b4055-d903-491e-bbf8-4777d51a1af8","Type":"ContainerDied","Data":"b19fb7a7667d09e3c59d54fcbe249af2fd6a6fdc99fde2d16643240caf47febe"} Mar 14 05:57:15 crc kubenswrapper[4713]: I0314 05:57:15.957463 4713 generic.go:334] "Generic (PLEG): container finished" podID="145c4018-82f1-49b5-9d3b-15c97c299a4a" containerID="2d261e1bd6c66bf317e7962278ac94bd87c70e5cdbe5c85550fac6df5c0a208b" exitCode=0 Mar 14 05:57:15 crc kubenswrapper[4713]: I0314 05:57:15.957505 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"145c4018-82f1-49b5-9d3b-15c97c299a4a","Type":"ContainerDied","Data":"2d261e1bd6c66bf317e7962278ac94bd87c70e5cdbe5c85550fac6df5c0a208b"} Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.431133 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.541547 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-server-conf\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.541656 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w52g\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-kube-api-access-7w52g\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.541731 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/145c4018-82f1-49b5-9d3b-15c97c299a4a-pod-info\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 
05:57:19.541757 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-tls\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.541836 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-plugins\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.541908 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-config-data\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.541940 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-erlang-cookie\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.541986 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/145c4018-82f1-49b5-9d3b-15c97c299a4a-erlang-cookie-secret\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.542025 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-plugins-conf\") 
pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.543163 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.543232 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-confd\") pod \"145c4018-82f1-49b5-9d3b-15c97c299a4a\" (UID: \"145c4018-82f1-49b5-9d3b-15c97c299a4a\") " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.546370 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.552463 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.555414 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.557849 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.563981 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.564057 4713 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.580299 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/145c4018-82f1-49b5-9d3b-15c97c299a4a-pod-info" (OuterVolumeSpecName: "pod-info") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.580827 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145c4018-82f1-49b5-9d3b-15c97c299a4a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.586044 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.617568 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-kube-api-access-7w52g" (OuterVolumeSpecName: "kube-api-access-7w52g") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "kube-api-access-7w52g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.668588 4713 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/145c4018-82f1-49b5-9d3b-15c97c299a4a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.671774 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w52g\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-kube-api-access-7w52g\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.672049 4713 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/145c4018-82f1-49b5-9d3b-15c97c299a4a-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.672245 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.677503 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296" (OuterVolumeSpecName: "persistence") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "pvc-6481649c-1830-4401-ac86-2f626a48c296". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.711694 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-config-data" (OuterVolumeSpecName: "config-data") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.737623 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-server-conf" (OuterVolumeSpecName: "server-conf") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.776161 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") on node \"crc\" " Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.776230 4713 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.776250 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/145c4018-82f1-49b5-9d3b-15c97c299a4a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.821526 4713 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.825587 4713 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6481649c-1830-4401-ac86-2f626a48c296" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296") on node "crc" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.838464 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "145c4018-82f1-49b5-9d3b-15c97c299a4a" (UID: "145c4018-82f1-49b5-9d3b-15c97c299a4a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.879443 4713 reconciler_common.go:293] "Volume detached for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:19 crc kubenswrapper[4713]: I0314 05:57:19.879481 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/145c4018-82f1-49b5-9d3b-15c97c299a4a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.018302 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"145c4018-82f1-49b5-9d3b-15c97c299a4a","Type":"ContainerDied","Data":"79d4b290a60238e60d2f1acec0af0949e417f1dfe7b520d5f2570d13f9b0e7fd"} Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.018373 4713 scope.go:117] "RemoveContainer" containerID="2d261e1bd6c66bf317e7962278ac94bd87c70e5cdbe5c85550fac6df5c0a208b" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.018420 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.068516 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.103930 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.125339 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 14 05:57:20 crc kubenswrapper[4713]: E0314 05:57:20.126103 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145c4018-82f1-49b5-9d3b-15c97c299a4a" containerName="setup-container" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.126125 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="145c4018-82f1-49b5-9d3b-15c97c299a4a" containerName="setup-container" Mar 14 05:57:20 crc kubenswrapper[4713]: E0314 05:57:20.126159 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145c4018-82f1-49b5-9d3b-15c97c299a4a" containerName="rabbitmq" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.126168 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="145c4018-82f1-49b5-9d3b-15c97c299a4a" containerName="rabbitmq" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.126522 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="145c4018-82f1-49b5-9d3b-15c97c299a4a" containerName="rabbitmq" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.128369 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.136666 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192078 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192196 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b7136fb-37b4-4b12-a917-37f2a708eedd-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192312 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192378 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b7136fb-37b4-4b12-a917-37f2a708eedd-config-data\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192423 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192502 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192560 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b7136fb-37b4-4b12-a917-37f2a708eedd-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192723 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b7136fb-37b4-4b12-a917-37f2a708eedd-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192742 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjthz\" (UniqueName: \"kubernetes.io/projected/8b7136fb-37b4-4b12-a917-37f2a708eedd-kube-api-access-xjthz\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192782 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.192818 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b7136fb-37b4-4b12-a917-37f2a708eedd-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.294843 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b7136fb-37b4-4b12-a917-37f2a708eedd-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.294934 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjthz\" (UniqueName: \"kubernetes.io/projected/8b7136fb-37b4-4b12-a917-37f2a708eedd-kube-api-access-xjthz\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.294965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.294991 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b7136fb-37b4-4b12-a917-37f2a708eedd-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: 
\"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.295049 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.295086 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b7136fb-37b4-4b12-a917-37f2a708eedd-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.295123 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.295264 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b7136fb-37b4-4b12-a917-37f2a708eedd-config-data\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.295303 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.295385 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.295434 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b7136fb-37b4-4b12-a917-37f2a708eedd-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.297124 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.300640 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.300982 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b7136fb-37b4-4b12-a917-37f2a708eedd-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.301072 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.301101 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81180cff6c7d54037a275b5eec5606984e9d96ef07a76c39a39ebf70b61a2c81/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.301332 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.301819 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b7136fb-37b4-4b12-a917-37f2a708eedd-config-data\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.301972 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b7136fb-37b4-4b12-a917-37f2a708eedd-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.303197 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b7136fb-37b4-4b12-a917-37f2a708eedd-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 
05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.305735 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b7136fb-37b4-4b12-a917-37f2a708eedd-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.306991 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b7136fb-37b4-4b12-a917-37f2a708eedd-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.316364 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjthz\" (UniqueName: \"kubernetes.io/projected/8b7136fb-37b4-4b12-a917-37f2a708eedd-kube-api-access-xjthz\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.394760 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6481649c-1830-4401-ac86-2f626a48c296\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6481649c-1830-4401-ac86-2f626a48c296\") pod \"rabbitmq-server-2\" (UID: \"8b7136fb-37b4-4b12-a917-37f2a708eedd\") " pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.464410 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 14 05:57:20 crc kubenswrapper[4713]: I0314 05:57:20.565682 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:57:20 crc kubenswrapper[4713]: E0314 05:57:20.565975 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.080410 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-w5frw"] Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.082438 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.088622 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.114506 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.114583 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.114616 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr8n7\" (UniqueName: \"kubernetes.io/projected/c83c38a2-e486-41f6-b2db-5be23fbcd94a-kube-api-access-gr8n7\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.114690 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.114735 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-config\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.114822 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.114902 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.137274 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-w5frw"] Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.216757 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.216865 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: 
\"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.216941 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.216974 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.216996 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr8n7\" (UniqueName: \"kubernetes.io/projected/c83c38a2-e486-41f6-b2db-5be23fbcd94a-kube-api-access-gr8n7\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.217048 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.217083 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-config\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " 
pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.217930 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-config\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.218532 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.232499 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.235274 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.242105 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: 
I0314 05:57:21.249249 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.275843 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr8n7\" (UniqueName: \"kubernetes.io/projected/c83c38a2-e486-41f6-b2db-5be23fbcd94a-kube-api-access-gr8n7\") pod \"dnsmasq-dns-5b75489c6f-w5frw\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.364746 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-w5frw"] Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.366061 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.391523 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-bhxpb"] Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.394220 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.412249 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-bhxpb"] Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.422181 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.422268 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.422372 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.422420 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-config\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.422468 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.422521 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.422565 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlpf5\" (UniqueName: \"kubernetes.io/projected/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-kube-api-access-zlpf5\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.524897 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-config\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.524986 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.525067 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.525116 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlpf5\" (UniqueName: \"kubernetes.io/projected/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-kube-api-access-zlpf5\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.525174 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.525877 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.526030 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.527095 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.527271 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.527399 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-config\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.528141 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.528329 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.528342 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.549816 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlpf5\" (UniqueName: \"kubernetes.io/projected/4c2880bc-ddaa-44ac-81f5-05e29a7c05d0-kube-api-access-zlpf5\") pod \"dnsmasq-dns-5d75f767dc-bhxpb\" (UID: \"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0\") " pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.589000 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="145c4018-82f1-49b5-9d3b-15c97c299a4a" path="/var/lib/kubelet/pods/145c4018-82f1-49b5-9d3b-15c97c299a4a/volumes" Mar 14 05:57:21 crc kubenswrapper[4713]: I0314 05:57:21.777815 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.050429 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.087828 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca9b4055-d903-491e-bbf8-4777d51a1af8","Type":"ContainerDied","Data":"b4f34d1e9cab71f50b8ab398225afc470fea2842bba9479d691c4400fa71f068"} Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.087952 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.137441 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d7rh\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-kube-api-access-5d7rh\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.137571 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-confd\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.139039 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.139151 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-config-data\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.139186 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-plugins-conf\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.139310 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-tls\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.139344 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca9b4055-d903-491e-bbf8-4777d51a1af8-pod-info\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.139391 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-server-conf\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.139441 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-plugins\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.139477 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca9b4055-d903-491e-bbf8-4777d51a1af8-erlang-cookie-secret\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.139504 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-erlang-cookie\") pod \"ca9b4055-d903-491e-bbf8-4777d51a1af8\" (UID: \"ca9b4055-d903-491e-bbf8-4777d51a1af8\") " Mar 14 
05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.143778 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.144736 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.147510 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.167464 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.168810 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ca9b4055-d903-491e-bbf8-4777d51a1af8-pod-info" (OuterVolumeSpecName: "pod-info") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.168778 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9b4055-d903-491e-bbf8-4777d51a1af8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.169175 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-kube-api-access-5d7rh" (OuterVolumeSpecName: "kube-api-access-5d7rh") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "kube-api-access-5d7rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.205712 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-config-data" (OuterVolumeSpecName: "config-data") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.219254 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3" (OuterVolumeSpecName: "persistence") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.244323 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d7rh\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-kube-api-access-5d7rh\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.244387 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") on node \"crc\" " Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.244406 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.244423 4713 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.244435 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.244446 4713 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca9b4055-d903-491e-bbf8-4777d51a1af8-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.244457 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.244467 4713 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca9b4055-d903-491e-bbf8-4777d51a1af8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.244477 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.292917 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-server-conf" (OuterVolumeSpecName: "server-conf") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.298610 4713 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.298762 4713 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3") on node "crc" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.347490 4713 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca9b4055-d903-491e-bbf8-4777d51a1af8-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.347544 4713 reconciler_common.go:293] "Volume detached for volume \"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.371110 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ca9b4055-d903-491e-bbf8-4777d51a1af8" (UID: "ca9b4055-d903-491e-bbf8-4777d51a1af8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.439761 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.450435 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca9b4055-d903-491e-bbf8-4777d51a1af8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.460144 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.485294 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 05:57:25 crc kubenswrapper[4713]: E0314 05:57:25.486017 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9b4055-d903-491e-bbf8-4777d51a1af8" containerName="rabbitmq" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.486037 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9b4055-d903-491e-bbf8-4777d51a1af8" containerName="rabbitmq" Mar 14 05:57:25 crc kubenswrapper[4713]: E0314 05:57:25.486061 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9b4055-d903-491e-bbf8-4777d51a1af8" containerName="setup-container" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.486071 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9b4055-d903-491e-bbf8-4777d51a1af8" containerName="setup-container" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.486393 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9b4055-d903-491e-bbf8-4777d51a1af8" containerName="rabbitmq" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.488124 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.490127 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.494775 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.494874 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.495012 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.494877 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.495135 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.495158 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6t2qm" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.500765 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560174 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560246 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560269 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560322 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560431 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560454 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdzb\" (UniqueName: \"kubernetes.io/projected/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-kube-api-access-2mdzb\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560532 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560600 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.560665 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.584074 4713 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="ca9b4055-d903-491e-bbf8-4777d51a1af8" path="/var/lib/kubelet/pods/ca9b4055-d903-491e-bbf8-4777d51a1af8/volumes" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.663223 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.663615 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.663821 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.664000 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.664377 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.664530 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.664566 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.664592 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.664726 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.664787 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc 
kubenswrapper[4713]: I0314 05:57:25.664973 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.665394 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdzb\" (UniqueName: \"kubernetes.io/projected/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-kube-api-access-2mdzb\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.665902 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.666602 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.667407 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.667814 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.668167 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.668218 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d9e5f92b3791c958b2470b172a1fa1218f1b4f5595f643eb3c99247f0499ec82/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.669635 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.670976 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.671739 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.672627 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.687395 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdzb\" (UniqueName: \"kubernetes.io/projected/b83bd95f-ad77-4c7a-9e24-5d2320c7823d-kube-api-access-2mdzb\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: E0314 05:57:25.723111 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 14 05:57:25 crc kubenswrapper[4713]: E0314 05:57:25.723186 4713 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 14 05:57:25 crc kubenswrapper[4713]: E0314 05:57:25.723351 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mg2rl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-cv4bn_openstack(0230bf57-ad22-415a-bc91-2269773f8097): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 
14 05:57:25 crc kubenswrapper[4713]: E0314 05:57:25.724757 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-cv4bn" podUID="0230bf57-ad22-415a-bc91-2269773f8097" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.754383 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cd5a26b-b9a4-4432-b9cb-d827ffa134d3\") pod \"rabbitmq-cell1-server-0\" (UID: \"b83bd95f-ad77-4c7a-9e24-5d2320c7823d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:25 crc kubenswrapper[4713]: I0314 05:57:25.814102 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:57:26 crc kubenswrapper[4713]: E0314 05:57:26.099171 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-cv4bn" podUID="0230bf57-ad22-415a-bc91-2269773f8097" Mar 14 05:57:27 crc kubenswrapper[4713]: I0314 05:57:27.904710 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-w5frw"] Mar 14 05:57:28 crc kubenswrapper[4713]: I0314 05:57:28.253238 4713 scope.go:117] "RemoveContainer" containerID="aa4a5ceb1b002ee0c5369d9f8a6166c1e3452680788305e7434abe62eb526493" Mar 14 05:57:28 crc kubenswrapper[4713]: E0314 05:57:28.283695 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 14 05:57:28 crc 
kubenswrapper[4713]: E0314 05:57:28.283752 4713 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 14 05:57:28 crc kubenswrapper[4713]: E0314 05:57:28.283886 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n565h569h56h5cch687h694hc6h695h56dh5c4hb7hd5h54ch59ch7dh5bfh695h5c8h99h65chbh96h56bh7h677h65fh675h79h65h644h577h659q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mvfph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(c2c79b18-2189-46d9-bbd4-55f58870d723): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 05:57:28 crc kubenswrapper[4713]: I0314 05:57:28.938576 4713 scope.go:117] "RemoveContainer" containerID="b19fb7a7667d09e3c59d54fcbe249af2fd6a6fdc99fde2d16643240caf47febe" Mar 14 05:57:29 crc kubenswrapper[4713]: I0314 05:57:29.001970 4713 scope.go:117] "RemoveContainer" containerID="757454825f7b8736377c878e350329ecf32a743829e7db7e77afe631eff684e5" Mar 14 05:57:29 crc kubenswrapper[4713]: I0314 05:57:29.154600 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" event={"ID":"c83c38a2-e486-41f6-b2db-5be23fbcd94a","Type":"ContainerStarted","Data":"378ca7014b902a5adae3cb3bb3df5d52289ac71d9e4fbbe9e19fb377921a1e14"} Mar 14 05:57:29 crc kubenswrapper[4713]: W0314 05:57:29.164119 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b7136fb_37b4_4b12_a917_37f2a708eedd.slice/crio-1fa59eb4dc5d6cdb2c1fe775a46e546dfbc04ad6c8d90e6e818d30cb88f7b068 WatchSource:0}: Error 
finding container 1fa59eb4dc5d6cdb2c1fe775a46e546dfbc04ad6c8d90e6e818d30cb88f7b068: Status 404 returned error can't find the container with id 1fa59eb4dc5d6cdb2c1fe775a46e546dfbc04ad6c8d90e6e818d30cb88f7b068 Mar 14 05:57:29 crc kubenswrapper[4713]: I0314 05:57:29.169028 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 14 05:57:29 crc kubenswrapper[4713]: I0314 05:57:29.311463 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 05:57:29 crc kubenswrapper[4713]: W0314 05:57:29.318875 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c2880bc_ddaa_44ac_81f5_05e29a7c05d0.slice/crio-caae272c8df6e2983c45383211e545d7b34cebdebb2b2b35cf269b0b4314e240 WatchSource:0}: Error finding container caae272c8df6e2983c45383211e545d7b34cebdebb2b2b35cf269b0b4314e240: Status 404 returned error can't find the container with id caae272c8df6e2983c45383211e545d7b34cebdebb2b2b35cf269b0b4314e240 Mar 14 05:57:29 crc kubenswrapper[4713]: I0314 05:57:29.328230 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-bhxpb"] Mar 14 05:57:30 crc kubenswrapper[4713]: I0314 05:57:30.169425 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b83bd95f-ad77-4c7a-9e24-5d2320c7823d","Type":"ContainerStarted","Data":"25424ed046d408dbd4d83aa60813d2aaeed8a477cd82e153514bfb639c8d02fd"} Mar 14 05:57:30 crc kubenswrapper[4713]: I0314 05:57:30.170646 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8b7136fb-37b4-4b12-a917-37f2a708eedd","Type":"ContainerStarted","Data":"1fa59eb4dc5d6cdb2c1fe775a46e546dfbc04ad6c8d90e6e818d30cb88f7b068"} Mar 14 05:57:30 crc kubenswrapper[4713]: I0314 05:57:30.172816 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" event={"ID":"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0","Type":"ContainerStarted","Data":"caae272c8df6e2983c45383211e545d7b34cebdebb2b2b35cf269b0b4314e240"} Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.187055 4713 generic.go:334] "Generic (PLEG): container finished" podID="c83c38a2-e486-41f6-b2db-5be23fbcd94a" containerID="06d72d07c256d4ff21082a4394374f21fb4138fc2fb76e2ff4028023c446945f" exitCode=0 Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.187110 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" event={"ID":"c83c38a2-e486-41f6-b2db-5be23fbcd94a","Type":"ContainerDied","Data":"06d72d07c256d4ff21082a4394374f21fb4138fc2fb76e2ff4028023c446945f"} Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.189252 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" event={"ID":"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0","Type":"ContainerDied","Data":"91f78762d42adfd5e3aa86331c8195013642f7c62c41a237c1d3c7924e1d676b"} Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.189162 4713 generic.go:334] "Generic (PLEG): container finished" podID="4c2880bc-ddaa-44ac-81f5-05e29a7c05d0" containerID="91f78762d42adfd5e3aa86331c8195013642f7c62c41a237c1d3c7924e1d676b" exitCode=0 Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.699551 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.762490 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-nb\") pod \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.762753 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-swift-storage-0\") pod \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.762865 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr8n7\" (UniqueName: \"kubernetes.io/projected/c83c38a2-e486-41f6-b2db-5be23fbcd94a-kube-api-access-gr8n7\") pod \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.762922 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-openstack-edpm-ipam\") pod \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.762958 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-sb\") pod \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.763063 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-svc\") pod \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.763089 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-config\") pod \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\" (UID: \"c83c38a2-e486-41f6-b2db-5be23fbcd94a\") " Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.768319 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83c38a2-e486-41f6-b2db-5be23fbcd94a-kube-api-access-gr8n7" (OuterVolumeSpecName: "kube-api-access-gr8n7") pod "c83c38a2-e486-41f6-b2db-5be23fbcd94a" (UID: "c83c38a2-e486-41f6-b2db-5be23fbcd94a"). InnerVolumeSpecName "kube-api-access-gr8n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.791537 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c83c38a2-e486-41f6-b2db-5be23fbcd94a" (UID: "c83c38a2-e486-41f6-b2db-5be23fbcd94a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.797253 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c83c38a2-e486-41f6-b2db-5be23fbcd94a" (UID: "c83c38a2-e486-41f6-b2db-5be23fbcd94a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.799773 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c83c38a2-e486-41f6-b2db-5be23fbcd94a" (UID: "c83c38a2-e486-41f6-b2db-5be23fbcd94a"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.800138 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-config" (OuterVolumeSpecName: "config") pod "c83c38a2-e486-41f6-b2db-5be23fbcd94a" (UID: "c83c38a2-e486-41f6-b2db-5be23fbcd94a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.802160 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c83c38a2-e486-41f6-b2db-5be23fbcd94a" (UID: "c83c38a2-e486-41f6-b2db-5be23fbcd94a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.804499 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c83c38a2-e486-41f6-b2db-5be23fbcd94a" (UID: "c83c38a2-e486-41f6-b2db-5be23fbcd94a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.865752 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.865806 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.865821 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.865834 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.865846 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.865858 4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c83c38a2-e486-41f6-b2db-5be23fbcd94a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:31 crc kubenswrapper[4713]: I0314 05:57:31.865871 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr8n7\" (UniqueName: \"kubernetes.io/projected/c83c38a2-e486-41f6-b2db-5be23fbcd94a-kube-api-access-gr8n7\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.210275 
4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8b7136fb-37b4-4b12-a917-37f2a708eedd","Type":"ContainerStarted","Data":"c15190dc66e799b776a359f2f596139fd5d0e90179036b66317cf4387a9f71f3"} Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.213785 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2c79b18-2189-46d9-bbd4-55f58870d723","Type":"ContainerStarted","Data":"e83fcb0dce5b36d330660df8e085a6c53ae964b9f92f319c2f4d05284ebbc8a4"} Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.213815 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2c79b18-2189-46d9-bbd4-55f58870d723","Type":"ContainerStarted","Data":"610883642c1313eab45b7c1976bc3f639d63b9ea5ceb73b23862171dc8e5be9a"} Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.215413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" event={"ID":"c83c38a2-e486-41f6-b2db-5be23fbcd94a","Type":"ContainerDied","Data":"378ca7014b902a5adae3cb3bb3df5d52289ac71d9e4fbbe9e19fb377921a1e14"} Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.215445 4713 scope.go:117] "RemoveContainer" containerID="06d72d07c256d4ff21082a4394374f21fb4138fc2fb76e2ff4028023c446945f" Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.215546 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-w5frw" Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.219705 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" event={"ID":"4c2880bc-ddaa-44ac-81f5-05e29a7c05d0","Type":"ContainerStarted","Data":"f718776f335f30c63907086a5dcbcb0153d64cbd369c3c534494cc3dc840252f"} Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.219775 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.222380 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b83bd95f-ad77-4c7a-9e24-5d2320c7823d","Type":"ContainerStarted","Data":"571493c6372524a8f2bf0c7974370b3c7c785b7c35124873e784dce3e4183757"} Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.303600 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" podStartSLOduration=11.303570811 podStartE2EDuration="11.303570811s" podCreationTimestamp="2026-03-14 05:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:57:32.290389527 +0000 UTC m=+1835.378298847" watchObservedRunningTime="2026-03-14 05:57:32.303570811 +0000 UTC m=+1835.391480121" Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.339569 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-w5frw"] Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.354259 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-w5frw"] Mar 14 05:57:32 crc kubenswrapper[4713]: I0314 05:57:32.564187 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:57:32 crc kubenswrapper[4713]: E0314 
05:57:32.564488 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:57:33 crc kubenswrapper[4713]: I0314 05:57:33.576911 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83c38a2-e486-41f6-b2db-5be23fbcd94a" path="/var/lib/kubelet/pods/c83c38a2-e486-41f6-b2db-5be23fbcd94a/volumes" Mar 14 05:57:33 crc kubenswrapper[4713]: E0314 05:57:33.847380 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="c2c79b18-2189-46d9-bbd4-55f58870d723" Mar 14 05:57:34 crc kubenswrapper[4713]: I0314 05:57:34.251510 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2c79b18-2189-46d9-bbd4-55f58870d723","Type":"ContainerStarted","Data":"2d2399c3fd6c1f82989994f13cbfd65be7e16f621422626c9d837f834c6ad42a"} Mar 14 05:57:34 crc kubenswrapper[4713]: I0314 05:57:34.251850 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 05:57:34 crc kubenswrapper[4713]: E0314 05:57:34.253623 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="c2c79b18-2189-46d9-bbd4-55f58870d723" Mar 14 05:57:35 crc kubenswrapper[4713]: E0314 05:57:35.268559 4713 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="c2c79b18-2189-46d9-bbd4-55f58870d723" Mar 14 05:57:36 crc kubenswrapper[4713]: I0314 05:57:36.782395 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d75f767dc-bhxpb" Mar 14 05:57:36 crc kubenswrapper[4713]: I0314 05:57:36.907877 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-jsqnh"] Mar 14 05:57:36 crc kubenswrapper[4713]: I0314 05:57:36.908133 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" podUID="8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" containerName="dnsmasq-dns" containerID="cri-o://f10ec90eb7bd9f8c0cd0175132758dae5ee71b86eb3222ca1cdc231440cf8166" gracePeriod=10 Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.310968 4713 generic.go:334] "Generic (PLEG): container finished" podID="8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" containerID="f10ec90eb7bd9f8c0cd0175132758dae5ee71b86eb3222ca1cdc231440cf8166" exitCode=0 Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.311337 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" event={"ID":"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e","Type":"ContainerDied","Data":"f10ec90eb7bd9f8c0cd0175132758dae5ee71b86eb3222ca1cdc231440cf8166"} Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.573248 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.722337 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-config\") pod \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.722643 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-svc\") pod \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.722756 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-nb\") pod \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.722812 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-sb\") pod \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.722835 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-swift-storage-0\") pod \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.722870 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfql\" 
(UniqueName: \"kubernetes.io/projected/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-kube-api-access-htfql\") pod \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\" (UID: \"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e\") " Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.744567 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-kube-api-access-htfql" (OuterVolumeSpecName: "kube-api-access-htfql") pod "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" (UID: "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e"). InnerVolumeSpecName "kube-api-access-htfql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.794734 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" (UID: "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.796259 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" (UID: "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.801646 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" (UID: "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.807884 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-config" (OuterVolumeSpecName: "config") pod "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" (UID: "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.824855 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" (UID: "8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.825367 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-config\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.825406 4713 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.825416 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.825428 4713 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.825437 
4713 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:37 crc kubenswrapper[4713]: I0314 05:57:37.825447 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfql\" (UniqueName: \"kubernetes.io/projected/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e-kube-api-access-htfql\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:38 crc kubenswrapper[4713]: I0314 05:57:38.322850 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" event={"ID":"8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e","Type":"ContainerDied","Data":"7a2c98ca80c67defd8275d4577203a6c5b125e5c64d8681d6afa62356bce254c"} Mar 14 05:57:38 crc kubenswrapper[4713]: I0314 05:57:38.322909 4713 scope.go:117] "RemoveContainer" containerID="f10ec90eb7bd9f8c0cd0175132758dae5ee71b86eb3222ca1cdc231440cf8166" Mar 14 05:57:38 crc kubenswrapper[4713]: I0314 05:57:38.322941 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-jsqnh" Mar 14 05:57:38 crc kubenswrapper[4713]: I0314 05:57:38.352397 4713 scope.go:117] "RemoveContainer" containerID="1bf287b51fd18f4b67b48a56e3407f95ef37a899b0fbd69c7ccfcda49c92ee81" Mar 14 05:57:38 crc kubenswrapper[4713]: I0314 05:57:38.360772 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-jsqnh"] Mar 14 05:57:38 crc kubenswrapper[4713]: I0314 05:57:38.437243 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-jsqnh"] Mar 14 05:57:39 crc kubenswrapper[4713]: I0314 05:57:39.587006 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" path="/var/lib/kubelet/pods/8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e/volumes" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.364558 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cv4bn" event={"ID":"0230bf57-ad22-415a-bc91-2269773f8097","Type":"ContainerStarted","Data":"a42f03a318dba23533f4c912f47f8130c181c5cc6bdaa48dc08d536336be4c8e"} Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.387721 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-cv4bn" podStartSLOduration=2.108216144 podStartE2EDuration="41.387695104s" podCreationTimestamp="2026-03-14 05:57:00 +0000 UTC" firstStartedPulling="2026-03-14 05:57:01.512579471 +0000 UTC m=+1804.600488771" lastFinishedPulling="2026-03-14 05:57:40.792058431 +0000 UTC m=+1843.879967731" observedRunningTime="2026-03-14 05:57:41.384353561 +0000 UTC m=+1844.472262861" watchObservedRunningTime="2026-03-14 05:57:41.387695104 +0000 UTC m=+1844.475604404" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.888566 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9"] Mar 14 05:57:41 crc kubenswrapper[4713]: E0314 05:57:41.889311 4713 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" containerName="dnsmasq-dns" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.889332 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" containerName="dnsmasq-dns" Mar 14 05:57:41 crc kubenswrapper[4713]: E0314 05:57:41.889361 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" containerName="init" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.889370 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" containerName="init" Mar 14 05:57:41 crc kubenswrapper[4713]: E0314 05:57:41.889413 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83c38a2-e486-41f6-b2db-5be23fbcd94a" containerName="init" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.889422 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83c38a2-e486-41f6-b2db-5be23fbcd94a" containerName="init" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.889681 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ce52c9b-3ba5-4dbe-9b5d-f73798da3e9e" containerName="dnsmasq-dns" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.889700 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83c38a2-e486-41f6-b2db-5be23fbcd94a" containerName="init" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.890802 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.893102 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.893653 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.897377 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.903716 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 05:57:41 crc kubenswrapper[4713]: I0314 05:57:41.904409 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9"] Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.055564 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc7dd\" (UniqueName: \"kubernetes.io/projected/484a7a0d-8b23-4b9f-a875-843d1d9145a0-kube-api-access-cc7dd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.055661 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.055781 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.055839 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.158260 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.158720 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc7dd\" (UniqueName: \"kubernetes.io/projected/484a7a0d-8b23-4b9f-a875-843d1d9145a0-kube-api-access-cc7dd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.158835 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.158969 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.164128 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.165389 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.177537 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.178993 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc7dd\" (UniqueName: \"kubernetes.io/projected/484a7a0d-8b23-4b9f-a875-843d1d9145a0-kube-api-access-cc7dd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:42 crc kubenswrapper[4713]: I0314 05:57:42.219471 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:57:43 crc kubenswrapper[4713]: I0314 05:57:43.128014 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9"] Mar 14 05:57:43 crc kubenswrapper[4713]: I0314 05:57:43.397923 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" event={"ID":"484a7a0d-8b23-4b9f-a875-843d1d9145a0","Type":"ContainerStarted","Data":"09d43f3f91c2135b3365aba155eb11d4bfb5c41fde97e1229b1b8c4447aadf54"} Mar 14 05:57:44 crc kubenswrapper[4713]: I0314 05:57:44.409886 4713 generic.go:334] "Generic (PLEG): container finished" podID="0230bf57-ad22-415a-bc91-2269773f8097" containerID="a42f03a318dba23533f4c912f47f8130c181c5cc6bdaa48dc08d536336be4c8e" exitCode=0 Mar 14 05:57:44 crc kubenswrapper[4713]: I0314 05:57:44.409989 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cv4bn" event={"ID":"0230bf57-ad22-415a-bc91-2269773f8097","Type":"ContainerDied","Data":"a42f03a318dba23533f4c912f47f8130c181c5cc6bdaa48dc08d536336be4c8e"} Mar 14 05:57:45 crc kubenswrapper[4713]: I0314 05:57:45.633828 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 05:57:45 crc 
kubenswrapper[4713]: I0314 05:57:45.755887 4713 scope.go:117] "RemoveContainer" containerID="04c7abccf019758bc4e9c5a0ffe67663b19f2a49a307a6a4dac9ba5787cbe312" Mar 14 05:57:45 crc kubenswrapper[4713]: I0314 05:57:45.834430 4713 scope.go:117] "RemoveContainer" containerID="492f85d8eb589d263332a6812524e48f4fe17fbd7412ad95587490cc6b501a72" Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.045605 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.119130 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-combined-ca-bundle\") pod \"0230bf57-ad22-415a-bc91-2269773f8097\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.119477 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-config-data\") pod \"0230bf57-ad22-415a-bc91-2269773f8097\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.119632 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg2rl\" (UniqueName: \"kubernetes.io/projected/0230bf57-ad22-415a-bc91-2269773f8097-kube-api-access-mg2rl\") pod \"0230bf57-ad22-415a-bc91-2269773f8097\" (UID: \"0230bf57-ad22-415a-bc91-2269773f8097\") " Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.128681 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0230bf57-ad22-415a-bc91-2269773f8097-kube-api-access-mg2rl" (OuterVolumeSpecName: "kube-api-access-mg2rl") pod "0230bf57-ad22-415a-bc91-2269773f8097" (UID: "0230bf57-ad22-415a-bc91-2269773f8097"). InnerVolumeSpecName "kube-api-access-mg2rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.223939 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg2rl\" (UniqueName: \"kubernetes.io/projected/0230bf57-ad22-415a-bc91-2269773f8097-kube-api-access-mg2rl\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.284310 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0230bf57-ad22-415a-bc91-2269773f8097" (UID: "0230bf57-ad22-415a-bc91-2269773f8097"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.326802 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.336810 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-config-data" (OuterVolumeSpecName: "config-data") pod "0230bf57-ad22-415a-bc91-2269773f8097" (UID: "0230bf57-ad22-415a-bc91-2269773f8097"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.431621 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0230bf57-ad22-415a-bc91-2269773f8097-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.506732 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cv4bn" event={"ID":"0230bf57-ad22-415a-bc91-2269773f8097","Type":"ContainerDied","Data":"806ab68e54b900241edc1e4665286a508c665c4f7b5ba652e7c5a989bf3f9186"} Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.506841 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="806ab68e54b900241edc1e4665286a508c665c4f7b5ba652e7c5a989bf3f9186" Mar 14 05:57:46 crc kubenswrapper[4713]: I0314 05:57:46.506843 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cv4bn" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.522535 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2c79b18-2189-46d9-bbd4-55f58870d723","Type":"ContainerStarted","Data":"bd0395a1990f66cb6d99e088c548f943054f350eec3636be7b07d8ea6d5e8ac0"} Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.564126 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7243675769999998 podStartE2EDuration="41.564106993s" podCreationTimestamp="2026-03-14 05:57:06 +0000 UTC" firstStartedPulling="2026-03-14 05:57:07.271916605 +0000 UTC m=+1810.359825905" lastFinishedPulling="2026-03-14 05:57:46.111656021 +0000 UTC m=+1849.199565321" observedRunningTime="2026-03-14 05:57:47.561024959 +0000 UTC m=+1850.648934259" watchObservedRunningTime="2026-03-14 05:57:47.564106993 +0000 UTC m=+1850.652016293" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.602554 4713 
scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:57:47 crc kubenswrapper[4713]: E0314 05:57:47.602806 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.661328 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6d66999447-xvgnl"] Mar 14 05:57:47 crc kubenswrapper[4713]: E0314 05:57:47.662244 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0230bf57-ad22-415a-bc91-2269773f8097" containerName="heat-db-sync" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.662274 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0230bf57-ad22-415a-bc91-2269773f8097" containerName="heat-db-sync" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.662546 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0230bf57-ad22-415a-bc91-2269773f8097" containerName="heat-db-sync" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.663806 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.704105 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6d66999447-xvgnl"] Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.730975 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-784d6d7c98-s79xh"] Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.732658 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.755220 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-784d6d7c98-s79xh"] Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.766307 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-combined-ca-bundle\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.766412 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-config-data-custom\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.766436 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-config-data\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.766465 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-public-tls-certs\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.766489 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cb708d-ad8b-4c22-9003-8e59fa88aa05-combined-ca-bundle\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.766522 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09cb708d-ad8b-4c22-9003-8e59fa88aa05-config-data\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.766597 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-internal-tls-certs\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.766628 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09cb708d-ad8b-4c22-9003-8e59fa88aa05-config-data-custom\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.767297 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkn6d\" (UniqueName: \"kubernetes.io/projected/09cb708d-ad8b-4c22-9003-8e59fa88aa05-kube-api-access-dkn6d\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.767366 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb2b5\" (UniqueName: \"kubernetes.io/projected/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-kube-api-access-rb2b5\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.790482 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-56f6f749f8-hrbrr"] Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.792415 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.811516 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-56f6f749f8-hrbrr"] Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.869943 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-internal-tls-certs\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.870193 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09cb708d-ad8b-4c22-9003-8e59fa88aa05-config-data-custom\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.870267 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn67k\" (UniqueName: \"kubernetes.io/projected/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-kube-api-access-kn67k\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " 
pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.870467 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkn6d\" (UniqueName: \"kubernetes.io/projected/09cb708d-ad8b-4c22-9003-8e59fa88aa05-kube-api-access-dkn6d\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.870519 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb2b5\" (UniqueName: \"kubernetes.io/projected/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-kube-api-access-rb2b5\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.870549 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-combined-ca-bundle\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.870588 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-config-data-custom\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.870905 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-internal-tls-certs\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: 
\"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.871196 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-combined-ca-bundle\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.871300 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-config-data-custom\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.871338 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-config-data\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.871370 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-public-tls-certs\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.871401 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cb708d-ad8b-4c22-9003-8e59fa88aa05-combined-ca-bundle\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " 
pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.871427 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09cb708d-ad8b-4c22-9003-8e59fa88aa05-config-data\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.871468 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-public-tls-certs\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.871520 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-config-data\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.881351 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-config-data\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.882495 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09cb708d-ad8b-4c22-9003-8e59fa88aa05-config-data-custom\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 
05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.885989 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-public-tls-certs\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.886484 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-combined-ca-bundle\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.890551 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-config-data-custom\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.895045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-internal-tls-certs\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.897089 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cb708d-ad8b-4c22-9003-8e59fa88aa05-combined-ca-bundle\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.897501 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09cb708d-ad8b-4c22-9003-8e59fa88aa05-config-data\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.904379 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb2b5\" (UniqueName: \"kubernetes.io/projected/6367bdc4-f55d-4f50-8b15-d1a05ce279e1-kube-api-access-rb2b5\") pod \"heat-api-784d6d7c98-s79xh\" (UID: \"6367bdc4-f55d-4f50-8b15-d1a05ce279e1\") " pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.919808 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkn6d\" (UniqueName: \"kubernetes.io/projected/09cb708d-ad8b-4c22-9003-8e59fa88aa05-kube-api-access-dkn6d\") pod \"heat-engine-6d66999447-xvgnl\" (UID: \"09cb708d-ad8b-4c22-9003-8e59fa88aa05\") " pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.973350 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-config-data\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.973444 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn67k\" (UniqueName: \"kubernetes.io/projected/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-kube-api-access-kn67k\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.973498 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-combined-ca-bundle\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.973526 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-config-data-custom\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.973572 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-internal-tls-certs\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.973680 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-public-tls-certs\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.977862 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-config-data-custom\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.977905 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-public-tls-certs\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.981308 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-config-data\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.992099 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-combined-ca-bundle\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.992433 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-internal-tls-certs\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:47 crc kubenswrapper[4713]: I0314 05:57:47.999248 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn67k\" (UniqueName: \"kubernetes.io/projected/6d4f1b6f-a408-4a91-9948-fe6bf54e13a9-kube-api-access-kn67k\") pod \"heat-cfnapi-56f6f749f8-hrbrr\" (UID: \"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9\") " pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:48 crc kubenswrapper[4713]: I0314 05:57:48.030020 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:48 crc kubenswrapper[4713]: I0314 05:57:48.102371 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:48 crc kubenswrapper[4713]: I0314 05:57:48.127594 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:56 crc kubenswrapper[4713]: W0314 05:57:56.201521 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09cb708d_ad8b_4c22_9003_8e59fa88aa05.slice/crio-ea2bbd99241a65f702a3252ef8d0f18657df052f6efd1766bb953d36f6ae8927 WatchSource:0}: Error finding container ea2bbd99241a65f702a3252ef8d0f18657df052f6efd1766bb953d36f6ae8927: Status 404 returned error can't find the container with id ea2bbd99241a65f702a3252ef8d0f18657df052f6efd1766bb953d36f6ae8927 Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.235498 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6d66999447-xvgnl"] Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.309395 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-784d6d7c98-s79xh"] Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.358387 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-56f6f749f8-hrbrr"] Mar 14 05:57:56 crc kubenswrapper[4713]: W0314 05:57:56.369257 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d4f1b6f_a408_4a91_9948_fe6bf54e13a9.slice/crio-646bd1e110b7dc699ab48d632539fd370b20e9b371eda4d33c07bda0af8a27a1 WatchSource:0}: Error finding container 646bd1e110b7dc699ab48d632539fd370b20e9b371eda4d33c07bda0af8a27a1: Status 404 returned error can't find the container with id 
646bd1e110b7dc699ab48d632539fd370b20e9b371eda4d33c07bda0af8a27a1 Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.756494 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-784d6d7c98-s79xh" event={"ID":"6367bdc4-f55d-4f50-8b15-d1a05ce279e1","Type":"ContainerStarted","Data":"348e820db094d6dff464c19325bb0c08cf499119a23c9d7f47966323063bee10"} Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.760323 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" event={"ID":"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9","Type":"ContainerStarted","Data":"646bd1e110b7dc699ab48d632539fd370b20e9b371eda4d33c07bda0af8a27a1"} Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.762958 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6d66999447-xvgnl" event={"ID":"09cb708d-ad8b-4c22-9003-8e59fa88aa05","Type":"ContainerStarted","Data":"a468911c3d451f71dd1f5c87ca6c59c8a2b4930d4f35e4b65c46b27467441421"} Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.762993 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6d66999447-xvgnl" event={"ID":"09cb708d-ad8b-4c22-9003-8e59fa88aa05","Type":"ContainerStarted","Data":"ea2bbd99241a65f702a3252ef8d0f18657df052f6efd1766bb953d36f6ae8927"} Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.763150 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.766709 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" event={"ID":"484a7a0d-8b23-4b9f-a875-843d1d9145a0","Type":"ContainerStarted","Data":"ae6d8577e11a32ede64c205a7e8a4784c61df96a779337d0fa55d9bba12c5fa0"} Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.807947 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-engine-6d66999447-xvgnl" podStartSLOduration=9.807926681 podStartE2EDuration="9.807926681s" podCreationTimestamp="2026-03-14 05:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:57:56.782577375 +0000 UTC m=+1859.870486685" watchObservedRunningTime="2026-03-14 05:57:56.807926681 +0000 UTC m=+1859.895835971" Mar 14 05:57:56 crc kubenswrapper[4713]: I0314 05:57:56.826522 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" podStartSLOduration=3.433495079 podStartE2EDuration="15.826499051s" podCreationTimestamp="2026-03-14 05:57:41 +0000 UTC" firstStartedPulling="2026-03-14 05:57:43.133361306 +0000 UTC m=+1846.221270606" lastFinishedPulling="2026-03-14 05:57:55.526365278 +0000 UTC m=+1858.614274578" observedRunningTime="2026-03-14 05:57:56.799832813 +0000 UTC m=+1859.887742133" watchObservedRunningTime="2026-03-14 05:57:56.826499051 +0000 UTC m=+1859.914408351" Mar 14 05:57:59 crc kubenswrapper[4713]: I0314 05:57:59.809015 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" event={"ID":"6d4f1b6f-a408-4a91-9948-fe6bf54e13a9","Type":"ContainerStarted","Data":"3091614e9324cff81136c48001b4aaa6e5d37e60610824628fa1cebcf7ddb779"} Mar 14 05:57:59 crc kubenswrapper[4713]: I0314 05:57:59.809672 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:57:59 crc kubenswrapper[4713]: I0314 05:57:59.811602 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-784d6d7c98-s79xh" event={"ID":"6367bdc4-f55d-4f50-8b15-d1a05ce279e1","Type":"ContainerStarted","Data":"6c4a4b1cf752d3eb650dbc9ce399ae45a84055d3239ac15e68ce9477b59c8148"} Mar 14 05:57:59 crc kubenswrapper[4713]: I0314 05:57:59.813432 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:57:59 crc kubenswrapper[4713]: I0314 05:57:59.839554 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" podStartSLOduration=10.667593949 podStartE2EDuration="12.839526681s" podCreationTimestamp="2026-03-14 05:57:47 +0000 UTC" firstStartedPulling="2026-03-14 05:57:56.371421358 +0000 UTC m=+1859.459330668" lastFinishedPulling="2026-03-14 05:57:58.5433541 +0000 UTC m=+1861.631263400" observedRunningTime="2026-03-14 05:57:59.827276455 +0000 UTC m=+1862.915185755" watchObservedRunningTime="2026-03-14 05:57:59.839526681 +0000 UTC m=+1862.927435981" Mar 14 05:57:59 crc kubenswrapper[4713]: I0314 05:57:59.862887 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-784d6d7c98-s79xh" podStartSLOduration=10.561829436 podStartE2EDuration="12.862857406s" podCreationTimestamp="2026-03-14 05:57:47 +0000 UTC" firstStartedPulling="2026-03-14 05:57:56.238437361 +0000 UTC m=+1859.326346661" lastFinishedPulling="2026-03-14 05:57:58.539465331 +0000 UTC m=+1861.627374631" observedRunningTime="2026-03-14 05:57:59.858303987 +0000 UTC m=+1862.946213287" watchObservedRunningTime="2026-03-14 05:57:59.862857406 +0000 UTC m=+1862.950766706" Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.155711 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557798-xv6b5"] Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.158130 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557798-xv6b5" Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.159860 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.161266 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.161622 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.178749 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557798-xv6b5"] Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.285644 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql8kc\" (UniqueName: \"kubernetes.io/projected/57f5d119-0ede-4eaa-a5cc-e1be3353385a-kube-api-access-ql8kc\") pod \"auto-csr-approver-29557798-xv6b5\" (UID: \"57f5d119-0ede-4eaa-a5cc-e1be3353385a\") " pod="openshift-infra/auto-csr-approver-29557798-xv6b5" Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.387543 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql8kc\" (UniqueName: \"kubernetes.io/projected/57f5d119-0ede-4eaa-a5cc-e1be3353385a-kube-api-access-ql8kc\") pod \"auto-csr-approver-29557798-xv6b5\" (UID: \"57f5d119-0ede-4eaa-a5cc-e1be3353385a\") " pod="openshift-infra/auto-csr-approver-29557798-xv6b5" Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.424047 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql8kc\" (UniqueName: \"kubernetes.io/projected/57f5d119-0ede-4eaa-a5cc-e1be3353385a-kube-api-access-ql8kc\") pod \"auto-csr-approver-29557798-xv6b5\" (UID: \"57f5d119-0ede-4eaa-a5cc-e1be3353385a\") " 
pod="openshift-infra/auto-csr-approver-29557798-xv6b5" Mar 14 05:58:00 crc kubenswrapper[4713]: I0314 05:58:00.478410 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557798-xv6b5" Mar 14 05:58:01 crc kubenswrapper[4713]: W0314 05:58:01.003993 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57f5d119_0ede_4eaa_a5cc_e1be3353385a.slice/crio-4213d1ef7bbc0db350e708ee70f67cbc94f5fd194d6fdd92efe026ca7095b539 WatchSource:0}: Error finding container 4213d1ef7bbc0db350e708ee70f67cbc94f5fd194d6fdd92efe026ca7095b539: Status 404 returned error can't find the container with id 4213d1ef7bbc0db350e708ee70f67cbc94f5fd194d6fdd92efe026ca7095b539 Mar 14 05:58:01 crc kubenswrapper[4713]: I0314 05:58:01.009708 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557798-xv6b5"] Mar 14 05:58:01 crc kubenswrapper[4713]: I0314 05:58:01.865319 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557798-xv6b5" event={"ID":"57f5d119-0ede-4eaa-a5cc-e1be3353385a","Type":"ContainerStarted","Data":"4213d1ef7bbc0db350e708ee70f67cbc94f5fd194d6fdd92efe026ca7095b539"} Mar 14 05:58:02 crc kubenswrapper[4713]: I0314 05:58:02.563823 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:58:02 crc kubenswrapper[4713]: E0314 05:58:02.564282 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:58:02 crc 
kubenswrapper[4713]: I0314 05:58:02.882505 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557798-xv6b5" event={"ID":"57f5d119-0ede-4eaa-a5cc-e1be3353385a","Type":"ContainerStarted","Data":"e70ec126e39036e60c7487b73cc482ee4cfa2556fc385def705e88cac3c57e96"} Mar 14 05:58:02 crc kubenswrapper[4713]: I0314 05:58:02.901395 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557798-xv6b5" podStartSLOduration=2.204794901 podStartE2EDuration="2.901373988s" podCreationTimestamp="2026-03-14 05:58:00 +0000 UTC" firstStartedPulling="2026-03-14 05:58:01.008703138 +0000 UTC m=+1864.096612448" lastFinishedPulling="2026-03-14 05:58:01.705282235 +0000 UTC m=+1864.793191535" observedRunningTime="2026-03-14 05:58:02.89492943 +0000 UTC m=+1865.982838740" watchObservedRunningTime="2026-03-14 05:58:02.901373988 +0000 UTC m=+1865.989283288" Mar 14 05:58:03 crc kubenswrapper[4713]: I0314 05:58:03.902278 4713 generic.go:334] "Generic (PLEG): container finished" podID="b83bd95f-ad77-4c7a-9e24-5d2320c7823d" containerID="571493c6372524a8f2bf0c7974370b3c7c785b7c35124873e784dce3e4183757" exitCode=0 Mar 14 05:58:03 crc kubenswrapper[4713]: I0314 05:58:03.902408 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b83bd95f-ad77-4c7a-9e24-5d2320c7823d","Type":"ContainerDied","Data":"571493c6372524a8f2bf0c7974370b3c7c785b7c35124873e784dce3e4183757"} Mar 14 05:58:03 crc kubenswrapper[4713]: I0314 05:58:03.906031 4713 generic.go:334] "Generic (PLEG): container finished" podID="8b7136fb-37b4-4b12-a917-37f2a708eedd" containerID="c15190dc66e799b776a359f2f596139fd5d0e90179036b66317cf4387a9f71f3" exitCode=0 Mar 14 05:58:03 crc kubenswrapper[4713]: I0314 05:58:03.906124 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" 
event={"ID":"8b7136fb-37b4-4b12-a917-37f2a708eedd","Type":"ContainerDied","Data":"c15190dc66e799b776a359f2f596139fd5d0e90179036b66317cf4387a9f71f3"} Mar 14 05:58:03 crc kubenswrapper[4713]: I0314 05:58:03.910142 4713 generic.go:334] "Generic (PLEG): container finished" podID="57f5d119-0ede-4eaa-a5cc-e1be3353385a" containerID="e70ec126e39036e60c7487b73cc482ee4cfa2556fc385def705e88cac3c57e96" exitCode=0 Mar 14 05:58:03 crc kubenswrapper[4713]: I0314 05:58:03.910218 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557798-xv6b5" event={"ID":"57f5d119-0ede-4eaa-a5cc-e1be3353385a","Type":"ContainerDied","Data":"e70ec126e39036e60c7487b73cc482ee4cfa2556fc385def705e88cac3c57e96"} Mar 14 05:58:04 crc kubenswrapper[4713]: I0314 05:58:04.790358 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-784d6d7c98-s79xh" Mar 14 05:58:04 crc kubenswrapper[4713]: I0314 05:58:04.868943 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5cd69694c5-ns9d6"] Mar 14 05:58:04 crc kubenswrapper[4713]: I0314 05:58:04.869571 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5cd69694c5-ns9d6" podUID="19f775ed-2ab4-464b-96c5-ccb0bc3b570d" containerName="heat-api" containerID="cri-o://e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c" gracePeriod=60 Mar 14 05:58:04 crc kubenswrapper[4713]: I0314 05:58:04.961182 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8b7136fb-37b4-4b12-a917-37f2a708eedd","Type":"ContainerStarted","Data":"7a4bf9d1b6e19632d5adee182edfc42c00ab2ecce6827239e947008e31dcfad2"} Mar 14 05:58:04 crc kubenswrapper[4713]: I0314 05:58:04.961543 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 14 05:58:04 crc kubenswrapper[4713]: I0314 05:58:04.967086 4713 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b83bd95f-ad77-4c7a-9e24-5d2320c7823d","Type":"ContainerStarted","Data":"6302db44bdff4d04497ead7ec995aa4dc9afab442fdfbd9cfd3abdbc88eff599"} Mar 14 05:58:04 crc kubenswrapper[4713]: I0314 05:58:04.967653 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.005093 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=45.005053036 podStartE2EDuration="45.005053036s" podCreationTimestamp="2026-03-14 05:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:58:04.988500309 +0000 UTC m=+1868.076409609" watchObservedRunningTime="2026-03-14 05:58:05.005053036 +0000 UTC m=+1868.092962336" Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.043945 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.043918089 podStartE2EDuration="40.043918089s" podCreationTimestamp="2026-03-14 05:57:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:58:05.029873938 +0000 UTC m=+1868.117783248" watchObservedRunningTime="2026-03-14 05:58:05.043918089 +0000 UTC m=+1868.131827389" Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.608195 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557798-xv6b5" Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.650359 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql8kc\" (UniqueName: \"kubernetes.io/projected/57f5d119-0ede-4eaa-a5cc-e1be3353385a-kube-api-access-ql8kc\") pod \"57f5d119-0ede-4eaa-a5cc-e1be3353385a\" (UID: \"57f5d119-0ede-4eaa-a5cc-e1be3353385a\") " Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.667462 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f5d119-0ede-4eaa-a5cc-e1be3353385a-kube-api-access-ql8kc" (OuterVolumeSpecName: "kube-api-access-ql8kc") pod "57f5d119-0ede-4eaa-a5cc-e1be3353385a" (UID: "57f5d119-0ede-4eaa-a5cc-e1be3353385a"). InnerVolumeSpecName "kube-api-access-ql8kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.757082 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql8kc\" (UniqueName: \"kubernetes.io/projected/57f5d119-0ede-4eaa-a5cc-e1be3353385a-kube-api-access-ql8kc\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.810068 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-56f6f749f8-hrbrr" Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.887097 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-595498c55-cl7wg"] Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.887385 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-595498c55-cl7wg" podUID="2d297aa0-037f-4330-994c-8075c9126844" containerName="heat-cfnapi" containerID="cri-o://09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e" gracePeriod=60 Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.984649 4713 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29557798-xv6b5" event={"ID":"57f5d119-0ede-4eaa-a5cc-e1be3353385a","Type":"ContainerDied","Data":"4213d1ef7bbc0db350e708ee70f67cbc94f5fd194d6fdd92efe026ca7095b539"} Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.984709 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4213d1ef7bbc0db350e708ee70f67cbc94f5fd194d6fdd92efe026ca7095b539" Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.984717 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557798-xv6b5" Mar 14 05:58:05 crc kubenswrapper[4713]: I0314 05:58:05.992556 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557792-f568x"] Mar 14 05:58:06 crc kubenswrapper[4713]: I0314 05:58:06.018837 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557792-f568x"] Mar 14 05:58:07 crc kubenswrapper[4713]: I0314 05:58:07.578996 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06adc2e2-0e41-49dc-8deb-0674f50a77de" path="/var/lib/kubelet/pods/06adc2e2-0e41-49dc-8deb-0674f50a77de/volumes" Mar 14 05:58:08 crc kubenswrapper[4713]: I0314 05:58:08.008775 4713 generic.go:334] "Generic (PLEG): container finished" podID="484a7a0d-8b23-4b9f-a875-843d1d9145a0" containerID="ae6d8577e11a32ede64c205a7e8a4784c61df96a779337d0fa55d9bba12c5fa0" exitCode=0 Mar 14 05:58:08 crc kubenswrapper[4713]: I0314 05:58:08.008860 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" event={"ID":"484a7a0d-8b23-4b9f-a875-843d1d9145a0","Type":"ContainerDied","Data":"ae6d8577e11a32ede64c205a7e8a4784c61df96a779337d0fa55d9bba12c5fa0"} Mar 14 05:58:08 crc kubenswrapper[4713]: I0314 05:58:08.075741 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/heat-engine-6d66999447-xvgnl" Mar 14 05:58:08 crc kubenswrapper[4713]: I0314 05:58:08.143700 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6979cc54d6-q8q5n"] Mar 14 05:58:08 crc kubenswrapper[4713]: I0314 05:58:08.144000 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6979cc54d6-q8q5n" podUID="0f319ea1-f399-41ba-81cd-edccb9905c98" containerName="heat-engine" containerID="cri-o://006c8f9692e707cebd32acc9b7ecfe259f967159f41735427a59df8819530974" gracePeriod=60 Mar 14 05:58:08 crc kubenswrapper[4713]: I0314 05:58:08.954018 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.036131 4713 generic.go:334] "Generic (PLEG): container finished" podID="19f775ed-2ab4-464b-96c5-ccb0bc3b570d" containerID="e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c" exitCode=0 Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.036334 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5cd69694c5-ns9d6" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.036392 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5cd69694c5-ns9d6" event={"ID":"19f775ed-2ab4-464b-96c5-ccb0bc3b570d","Type":"ContainerDied","Data":"e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c"} Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.036453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5cd69694c5-ns9d6" event={"ID":"19f775ed-2ab4-464b-96c5-ccb0bc3b570d","Type":"ContainerDied","Data":"edb19b6275adc1504ca8e508b5090abc7d45a9d4f6f2e16e7c4e80ca3e1f9d43"} Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.036479 4713 scope.go:117] "RemoveContainer" containerID="e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.057500 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-public-tls-certs\") pod \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.057683 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data-custom\") pod \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.057795 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-internal-tls-certs\") pod \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.057828 
4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data\") pod \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.057966 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tth2x\" (UniqueName: \"kubernetes.io/projected/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-kube-api-access-tth2x\") pod \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.057989 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-combined-ca-bundle\") pod \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\" (UID: \"19f775ed-2ab4-464b-96c5-ccb0bc3b570d\") " Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.094624 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "19f775ed-2ab4-464b-96c5-ccb0bc3b570d" (UID: "19f775ed-2ab4-464b-96c5-ccb0bc3b570d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.096047 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-kube-api-access-tth2x" (OuterVolumeSpecName: "kube-api-access-tth2x") pod "19f775ed-2ab4-464b-96c5-ccb0bc3b570d" (UID: "19f775ed-2ab4-464b-96c5-ccb0bc3b570d"). InnerVolumeSpecName "kube-api-access-tth2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.121191 4713 scope.go:117] "RemoveContainer" containerID="e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c" Mar 14 05:58:09 crc kubenswrapper[4713]: E0314 05:58:09.125461 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c\": container with ID starting with e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c not found: ID does not exist" containerID="e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.125503 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c"} err="failed to get container status \"e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c\": rpc error: code = NotFound desc = could not find container \"e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c\": container with ID starting with e3a983dba5ca857d37939bc6054201fac3f9197f2264520ec75fb93b8e811f9c not found: ID does not exist" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.164844 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19f775ed-2ab4-464b-96c5-ccb0bc3b570d" (UID: "19f775ed-2ab4-464b-96c5-ccb0bc3b570d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.184088 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.184157 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tth2x\" (UniqueName: \"kubernetes.io/projected/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-kube-api-access-tth2x\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.184176 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.186869 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "19f775ed-2ab4-464b-96c5-ccb0bc3b570d" (UID: "19f775ed-2ab4-464b-96c5-ccb0bc3b570d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.198908 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data" (OuterVolumeSpecName: "config-data") pod "19f775ed-2ab4-464b-96c5-ccb0bc3b570d" (UID: "19f775ed-2ab4-464b-96c5-ccb0bc3b570d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.215376 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "19f775ed-2ab4-464b-96c5-ccb0bc3b570d" (UID: "19f775ed-2ab4-464b-96c5-ccb0bc3b570d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.287902 4713 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.288195 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.288215 4713 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f775ed-2ab4-464b-96c5-ccb0bc3b570d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.414398 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5cd69694c5-ns9d6"] Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.438098 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5cd69694c5-ns9d6"] Mar 14 05:58:09 crc kubenswrapper[4713]: I0314 05:58:09.679031 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f775ed-2ab4-464b-96c5-ccb0bc3b570d" path="/var/lib/kubelet/pods/19f775ed-2ab4-464b-96c5-ccb0bc3b570d/volumes" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.054074 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.081529 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.084222 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" event={"ID":"484a7a0d-8b23-4b9f-a875-843d1d9145a0","Type":"ContainerDied","Data":"09d43f3f91c2135b3365aba155eb11d4bfb5c41fde97e1229b1b8c4447aadf54"} Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.084273 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d43f3f91c2135b3365aba155eb11d4bfb5c41fde97e1229b1b8c4447aadf54" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.084325 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.087997 4713 generic.go:334] "Generic (PLEG): container finished" podID="2d297aa0-037f-4330-994c-8075c9126844" containerID="09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e" exitCode=0 Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.088045 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-595498c55-cl7wg" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.088053 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-595498c55-cl7wg" event={"ID":"2d297aa0-037f-4330-994c-8075c9126844","Type":"ContainerDied","Data":"09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e"} Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.088105 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-595498c55-cl7wg" event={"ID":"2d297aa0-037f-4330-994c-8075c9126844","Type":"ContainerDied","Data":"a9c8b2fc7b00ef995a738ba724774d40330eb00bd930c23f3395296fee969308"} Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.088126 4713 scope.go:117] "RemoveContainer" containerID="09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.112555 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-combined-ca-bundle\") pod \"2d297aa0-037f-4330-994c-8075c9126844\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.112658 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc7dd\" (UniqueName: \"kubernetes.io/projected/484a7a0d-8b23-4b9f-a875-843d1d9145a0-kube-api-access-cc7dd\") pod \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.112934 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data-custom\") pod \"2d297aa0-037f-4330-994c-8075c9126844\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " Mar 14 05:58:10 crc kubenswrapper[4713]: 
I0314 05:58:10.112979 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-inventory\") pod \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.113029 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data\") pod \"2d297aa0-037f-4330-994c-8075c9126844\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.113090 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4b2f\" (UniqueName: \"kubernetes.io/projected/2d297aa0-037f-4330-994c-8075c9126844-kube-api-access-w4b2f\") pod \"2d297aa0-037f-4330-994c-8075c9126844\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.113142 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-internal-tls-certs\") pod \"2d297aa0-037f-4330-994c-8075c9126844\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.113248 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-public-tls-certs\") pod \"2d297aa0-037f-4330-994c-8075c9126844\" (UID: \"2d297aa0-037f-4330-994c-8075c9126844\") " Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.113292 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-repo-setup-combined-ca-bundle\") pod \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.113346 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-ssh-key-openstack-edpm-ipam\") pod \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\" (UID: \"484a7a0d-8b23-4b9f-a875-843d1d9145a0\") " Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.121691 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d297aa0-037f-4330-994c-8075c9126844-kube-api-access-w4b2f" (OuterVolumeSpecName: "kube-api-access-w4b2f") pod "2d297aa0-037f-4330-994c-8075c9126844" (UID: "2d297aa0-037f-4330-994c-8075c9126844"). InnerVolumeSpecName "kube-api-access-w4b2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.125572 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2d297aa0-037f-4330-994c-8075c9126844" (UID: "2d297aa0-037f-4330-994c-8075c9126844"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.127062 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "484a7a0d-8b23-4b9f-a875-843d1d9145a0" (UID: "484a7a0d-8b23-4b9f-a875-843d1d9145a0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.130550 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484a7a0d-8b23-4b9f-a875-843d1d9145a0-kube-api-access-cc7dd" (OuterVolumeSpecName: "kube-api-access-cc7dd") pod "484a7a0d-8b23-4b9f-a875-843d1d9145a0" (UID: "484a7a0d-8b23-4b9f-a875-843d1d9145a0"). InnerVolumeSpecName "kube-api-access-cc7dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.152657 4713 scope.go:117] "RemoveContainer" containerID="09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e" Mar 14 05:58:10 crc kubenswrapper[4713]: E0314 05:58:10.153166 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e\": container with ID starting with 09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e not found: ID does not exist" containerID="09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.153448 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e"} err="failed to get container status \"09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e\": rpc error: code = NotFound desc = could not find container \"09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e\": container with ID starting with 09cc90fec05caa9006c899e2c4d8accc8315e4faf499143ff796dc2efbc53c7e not found: ID does not exist" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.192255 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-ssh-key-openstack-edpm-ipam" 
(OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "484a7a0d-8b23-4b9f-a875-843d1d9145a0" (UID: "484a7a0d-8b23-4b9f-a875-843d1d9145a0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.200758 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-inventory" (OuterVolumeSpecName: "inventory") pod "484a7a0d-8b23-4b9f-a875-843d1d9145a0" (UID: "484a7a0d-8b23-4b9f-a875-843d1d9145a0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.216482 4713 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.216517 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.216526 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc7dd\" (UniqueName: \"kubernetes.io/projected/484a7a0d-8b23-4b9f-a875-843d1d9145a0-kube-api-access-cc7dd\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.216535 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.216545 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/484a7a0d-8b23-4b9f-a875-843d1d9145a0-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.216554 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4b2f\" (UniqueName: \"kubernetes.io/projected/2d297aa0-037f-4330-994c-8075c9126844-kube-api-access-w4b2f\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.238458 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d297aa0-037f-4330-994c-8075c9126844" (UID: "2d297aa0-037f-4330-994c-8075c9126844"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.262555 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2d297aa0-037f-4330-994c-8075c9126844" (UID: "2d297aa0-037f-4330-994c-8075c9126844"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.271125 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data" (OuterVolumeSpecName: "config-data") pod "2d297aa0-037f-4330-994c-8075c9126844" (UID: "2d297aa0-037f-4330-994c-8075c9126844"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.279362 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2d297aa0-037f-4330-994c-8075c9126844" (UID: "2d297aa0-037f-4330-994c-8075c9126844"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.318682 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.321329 4713 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.321396 4713 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.321412 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d297aa0-037f-4330-994c-8075c9126844-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.479151 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-595498c55-cl7wg"] Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.495521 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-595498c55-cl7wg"] Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.776388 4713 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/aodh-db-sync-qbgdm"] Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.791308 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-qbgdm"] Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.852151 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-z9m2b"] Mar 14 05:58:10 crc kubenswrapper[4713]: E0314 05:58:10.852815 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f5d119-0ede-4eaa-a5cc-e1be3353385a" containerName="oc" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.852840 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f5d119-0ede-4eaa-a5cc-e1be3353385a" containerName="oc" Mar 14 05:58:10 crc kubenswrapper[4713]: E0314 05:58:10.852857 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d297aa0-037f-4330-994c-8075c9126844" containerName="heat-cfnapi" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.852866 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d297aa0-037f-4330-994c-8075c9126844" containerName="heat-cfnapi" Mar 14 05:58:10 crc kubenswrapper[4713]: E0314 05:58:10.852897 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f775ed-2ab4-464b-96c5-ccb0bc3b570d" containerName="heat-api" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.852906 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f775ed-2ab4-464b-96c5-ccb0bc3b570d" containerName="heat-api" Mar 14 05:58:10 crc kubenswrapper[4713]: E0314 05:58:10.852957 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484a7a0d-8b23-4b9f-a875-843d1d9145a0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.852969 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="484a7a0d-8b23-4b9f-a875-843d1d9145a0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 14 05:58:10 crc kubenswrapper[4713]: 
I0314 05:58:10.853256 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d297aa0-037f-4330-994c-8075c9126844" containerName="heat-cfnapi" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.853282 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f775ed-2ab4-464b-96c5-ccb0bc3b570d" containerName="heat-api" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.853296 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="484a7a0d-8b23-4b9f-a875-843d1d9145a0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.853333 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f5d119-0ede-4eaa-a5cc-e1be3353385a" containerName="oc" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.854444 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.857121 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.864914 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-z9m2b"] Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.935670 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t7qz\" (UniqueName: \"kubernetes.io/projected/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-kube-api-access-9t7qz\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.935741 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-combined-ca-bundle\") pod \"aodh-db-sync-z9m2b\" (UID: 
\"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.936098 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-config-data\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:10 crc kubenswrapper[4713]: I0314 05:58:10.936501 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-scripts\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.039200 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-config-data\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.039313 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-scripts\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.039403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t7qz\" (UniqueName: \"kubernetes.io/projected/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-kube-api-access-9t7qz\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.039434 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-combined-ca-bundle\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.043427 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-scripts\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.046275 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-config-data\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.046625 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-combined-ca-bundle\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.059056 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t7qz\" (UniqueName: \"kubernetes.io/projected/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-kube-api-access-9t7qz\") pod \"aodh-db-sync-z9m2b\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.174880 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.182763 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t"] Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.184871 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.207554 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.208083 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.208257 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.208406 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.215728 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t"] Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.244625 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6m99t\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.244754 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sc25\" (UniqueName: 
\"kubernetes.io/projected/b4f93731-ee2b-4013-87a8-0ce7a242f506-kube-api-access-5sc25\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6m99t\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.245197 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6m99t\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.349451 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6m99t\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.349865 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6m99t\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.350030 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sc25\" (UniqueName: \"kubernetes.io/projected/b4f93731-ee2b-4013-87a8-0ce7a242f506-kube-api-access-5sc25\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6m99t\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.360239 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6m99t\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.367658 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6m99t\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.381311 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sc25\" (UniqueName: \"kubernetes.io/projected/b4f93731-ee2b-4013-87a8-0ce7a242f506-kube-api-access-5sc25\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6m99t\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.581184 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29070117-9a34-485a-bd43-2d2ea1d65e00" path="/var/lib/kubelet/pods/29070117-9a34-485a-bd43-2d2ea1d65e00/volumes" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.582955 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d297aa0-037f-4330-994c-8075c9126844" path="/var/lib/kubelet/pods/2d297aa0-037f-4330-994c-8075c9126844/volumes" Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.676439 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:11 crc kubenswrapper[4713]: W0314 05:58:11.790251 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode09a1a5f_2ab6_4f6f_ab5e_f608e2e43bb4.slice/crio-da4e06ade3315eb971fa176e17e9dab0476c5c0ccfa50d04c3bf39244440b486 WatchSource:0}: Error finding container da4e06ade3315eb971fa176e17e9dab0476c5c0ccfa50d04c3bf39244440b486: Status 404 returned error can't find the container with id da4e06ade3315eb971fa176e17e9dab0476c5c0ccfa50d04c3bf39244440b486 Mar 14 05:58:11 crc kubenswrapper[4713]: I0314 05:58:11.791988 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-z9m2b"] Mar 14 05:58:12 crc kubenswrapper[4713]: I0314 05:58:12.153368 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-z9m2b" event={"ID":"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4","Type":"ContainerStarted","Data":"da4e06ade3315eb971fa176e17e9dab0476c5c0ccfa50d04c3bf39244440b486"} Mar 14 05:58:12 crc kubenswrapper[4713]: I0314 05:58:12.345702 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t"] Mar 14 05:58:13 crc kubenswrapper[4713]: I0314 05:58:13.169074 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" event={"ID":"b4f93731-ee2b-4013-87a8-0ce7a242f506","Type":"ContainerStarted","Data":"e82db49c0e30f6ed85723741d8dd6d38ee214ced25b4974848b3a7e595984786"} Mar 14 05:58:13 crc kubenswrapper[4713]: I0314 05:58:13.169413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" event={"ID":"b4f93731-ee2b-4013-87a8-0ce7a242f506","Type":"ContainerStarted","Data":"5228b0f0ed2043ce8ba3037e08c3c5694e80bd9f019613cccbdf13ca75c725c7"} Mar 14 05:58:13 crc kubenswrapper[4713]: I0314 
05:58:13.205446 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" podStartSLOduration=1.791483841 podStartE2EDuration="2.205428342s" podCreationTimestamp="2026-03-14 05:58:11 +0000 UTC" firstStartedPulling="2026-03-14 05:58:12.351056888 +0000 UTC m=+1875.438966188" lastFinishedPulling="2026-03-14 05:58:12.765001389 +0000 UTC m=+1875.852910689" observedRunningTime="2026-03-14 05:58:13.191924499 +0000 UTC m=+1876.279833819" watchObservedRunningTime="2026-03-14 05:58:13.205428342 +0000 UTC m=+1876.293337642" Mar 14 05:58:14 crc kubenswrapper[4713]: E0314 05:58:14.760423 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="006c8f9692e707cebd32acc9b7ecfe259f967159f41735427a59df8819530974" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:58:14 crc kubenswrapper[4713]: E0314 05:58:14.762135 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="006c8f9692e707cebd32acc9b7ecfe259f967159f41735427a59df8819530974" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:58:14 crc kubenswrapper[4713]: E0314 05:58:14.764857 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="006c8f9692e707cebd32acc9b7ecfe259f967159f41735427a59df8819530974" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 05:58:14 crc kubenswrapper[4713]: E0314 05:58:14.764895 4713 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" probeType="Readiness" pod="openstack/heat-engine-6979cc54d6-q8q5n" podUID="0f319ea1-f399-41ba-81cd-edccb9905c98" containerName="heat-engine" Mar 14 05:58:15 crc kubenswrapper[4713]: I0314 05:58:15.817019 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b83bd95f-ad77-4c7a-9e24-5d2320c7823d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.24:5671: connect: connection refused" Mar 14 05:58:17 crc kubenswrapper[4713]: I0314 05:58:17.263009 4713 generic.go:334] "Generic (PLEG): container finished" podID="b4f93731-ee2b-4013-87a8-0ce7a242f506" containerID="e82db49c0e30f6ed85723741d8dd6d38ee214ced25b4974848b3a7e595984786" exitCode=0 Mar 14 05:58:17 crc kubenswrapper[4713]: I0314 05:58:17.263102 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" event={"ID":"b4f93731-ee2b-4013-87a8-0ce7a242f506","Type":"ContainerDied","Data":"e82db49c0e30f6ed85723741d8dd6d38ee214ced25b4974848b3a7e595984786"} Mar 14 05:58:17 crc kubenswrapper[4713]: I0314 05:58:17.575757 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:58:17 crc kubenswrapper[4713]: E0314 05:58:17.576274 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:58:18 crc kubenswrapper[4713]: I0314 05:58:18.275770 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-z9m2b" 
event={"ID":"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4","Type":"ContainerStarted","Data":"49c4eb7f48cc08811555d4532c834ba0d19a5c8de9c78ada217a1c074a393430"} Mar 14 05:58:18 crc kubenswrapper[4713]: I0314 05:58:18.375058 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-z9m2b" podStartSLOduration=2.725288832 podStartE2EDuration="8.375023644s" podCreationTimestamp="2026-03-14 05:58:10 +0000 UTC" firstStartedPulling="2026-03-14 05:58:11.795116652 +0000 UTC m=+1874.883025952" lastFinishedPulling="2026-03-14 05:58:17.444851464 +0000 UTC m=+1880.532760764" observedRunningTime="2026-03-14 05:58:18.302409378 +0000 UTC m=+1881.390318678" watchObservedRunningTime="2026-03-14 05:58:18.375023644 +0000 UTC m=+1881.462932944" Mar 14 05:58:18 crc kubenswrapper[4713]: I0314 05:58:18.930644 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.049931 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-ssh-key-openstack-edpm-ipam\") pod \"b4f93731-ee2b-4013-87a8-0ce7a242f506\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.050006 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-inventory\") pod \"b4f93731-ee2b-4013-87a8-0ce7a242f506\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.050240 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sc25\" (UniqueName: \"kubernetes.io/projected/b4f93731-ee2b-4013-87a8-0ce7a242f506-kube-api-access-5sc25\") pod 
\"b4f93731-ee2b-4013-87a8-0ce7a242f506\" (UID: \"b4f93731-ee2b-4013-87a8-0ce7a242f506\") " Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.056663 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f93731-ee2b-4013-87a8-0ce7a242f506-kube-api-access-5sc25" (OuterVolumeSpecName: "kube-api-access-5sc25") pod "b4f93731-ee2b-4013-87a8-0ce7a242f506" (UID: "b4f93731-ee2b-4013-87a8-0ce7a242f506"). InnerVolumeSpecName "kube-api-access-5sc25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.113496 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b4f93731-ee2b-4013-87a8-0ce7a242f506" (UID: "b4f93731-ee2b-4013-87a8-0ce7a242f506"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.155806 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.155858 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sc25\" (UniqueName: \"kubernetes.io/projected/b4f93731-ee2b-4013-87a8-0ce7a242f506-kube-api-access-5sc25\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.223388 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-inventory" (OuterVolumeSpecName: "inventory") pod "b4f93731-ee2b-4013-87a8-0ce7a242f506" (UID: "b4f93731-ee2b-4013-87a8-0ce7a242f506"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.258957 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4f93731-ee2b-4013-87a8-0ce7a242f506-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.288325 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.288318 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6m99t" event={"ID":"b4f93731-ee2b-4013-87a8-0ce7a242f506","Type":"ContainerDied","Data":"5228b0f0ed2043ce8ba3037e08c3c5694e80bd9f019613cccbdf13ca75c725c7"} Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.289219 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5228b0f0ed2043ce8ba3037e08c3c5694e80bd9f019613cccbdf13ca75c725c7" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.382889 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt"] Mar 14 05:58:19 crc kubenswrapper[4713]: E0314 05:58:19.383756 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f93731-ee2b-4013-87a8-0ce7a242f506" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.383777 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f93731-ee2b-4013-87a8-0ce7a242f506" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.383979 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f93731-ee2b-4013-87a8-0ce7a242f506" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.384877 
4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.389387 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.390189 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.395554 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.395574 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.419962 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt"] Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.463691 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.464116 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc 
kubenswrapper[4713]: I0314 05:58:19.464297 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khkzc\" (UniqueName: \"kubernetes.io/projected/399681c2-4d54-4329-9e80-55ae24289ee5-kube-api-access-khkzc\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.464512 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.568124 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khkzc\" (UniqueName: \"kubernetes.io/projected/399681c2-4d54-4329-9e80-55ae24289ee5-kube-api-access-khkzc\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.568258 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.568445 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.568609 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.579470 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.579529 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.584767 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.587803 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khkzc\" (UniqueName: \"kubernetes.io/projected/399681c2-4d54-4329-9e80-55ae24289ee5-kube-api-access-khkzc\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.598491 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-595498c55-cl7wg" podUID="2d297aa0-037f-4330-994c-8075c9126844" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.237:8000/healthcheck\": dial tcp 10.217.0.237:8000: i/o timeout" Mar 14 05:58:19 crc kubenswrapper[4713]: I0314 05:58:19.736536 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.332653 4713 generic.go:334] "Generic (PLEG): container finished" podID="0f319ea1-f399-41ba-81cd-edccb9905c98" containerID="006c8f9692e707cebd32acc9b7ecfe259f967159f41735427a59df8819530974" exitCode=0 Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.332785 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6979cc54d6-q8q5n" event={"ID":"0f319ea1-f399-41ba-81cd-edccb9905c98","Type":"ContainerDied","Data":"006c8f9692e707cebd32acc9b7ecfe259f967159f41735427a59df8819530974"} Mar 14 05:58:20 crc kubenswrapper[4713]: W0314 05:58:20.365856 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod399681c2_4d54_4329_9e80_55ae24289ee5.slice/crio-44c5f4296f5973ecd0ba6c8041ad428bebb1fc9eb07f1e5a4d35a57ad30ef790 WatchSource:0}: Error finding container 
44c5f4296f5973ecd0ba6c8041ad428bebb1fc9eb07f1e5a4d35a57ad30ef790: Status 404 returned error can't find the container with id 44c5f4296f5973ecd0ba6c8041ad428bebb1fc9eb07f1e5a4d35a57ad30ef790 Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.368269 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt"] Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.467410 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.566251 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.646439 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.716256 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q4bk\" (UniqueName: \"kubernetes.io/projected/0f319ea1-f399-41ba-81cd-edccb9905c98-kube-api-access-5q4bk\") pod \"0f319ea1-f399-41ba-81cd-edccb9905c98\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.716731 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data\") pod \"0f319ea1-f399-41ba-81cd-edccb9905c98\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.716910 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data-custom\") pod \"0f319ea1-f399-41ba-81cd-edccb9905c98\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " Mar 14 05:58:20 crc 
kubenswrapper[4713]: I0314 05:58:20.717059 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-combined-ca-bundle\") pod \"0f319ea1-f399-41ba-81cd-edccb9905c98\" (UID: \"0f319ea1-f399-41ba-81cd-edccb9905c98\") " Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.724485 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f319ea1-f399-41ba-81cd-edccb9905c98-kube-api-access-5q4bk" (OuterVolumeSpecName: "kube-api-access-5q4bk") pod "0f319ea1-f399-41ba-81cd-edccb9905c98" (UID: "0f319ea1-f399-41ba-81cd-edccb9905c98"). InnerVolumeSpecName "kube-api-access-5q4bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.750733 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f319ea1-f399-41ba-81cd-edccb9905c98" (UID: "0f319ea1-f399-41ba-81cd-edccb9905c98"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.772025 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f319ea1-f399-41ba-81cd-edccb9905c98" (UID: "0f319ea1-f399-41ba-81cd-edccb9905c98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.842911 4713 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.842941 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.842950 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q4bk\" (UniqueName: \"kubernetes.io/projected/0f319ea1-f399-41ba-81cd-edccb9905c98-kube-api-access-5q4bk\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.881645 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data" (OuterVolumeSpecName: "config-data") pod "0f319ea1-f399-41ba-81cd-edccb9905c98" (UID: "0f319ea1-f399-41ba-81cd-edccb9905c98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:20 crc kubenswrapper[4713]: I0314 05:58:20.957258 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f319ea1-f399-41ba-81cd-edccb9905c98-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.346376 4713 generic.go:334] "Generic (PLEG): container finished" podID="e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4" containerID="49c4eb7f48cc08811555d4532c834ba0d19a5c8de9c78ada217a1c074a393430" exitCode=0 Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.346647 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-z9m2b" event={"ID":"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4","Type":"ContainerDied","Data":"49c4eb7f48cc08811555d4532c834ba0d19a5c8de9c78ada217a1c074a393430"} Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.349100 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" event={"ID":"399681c2-4d54-4329-9e80-55ae24289ee5","Type":"ContainerStarted","Data":"316400ed1e50d7b202302ad4d86b86613ce0ce952e9573ce817f569df7e6d5bc"} Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.349137 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" event={"ID":"399681c2-4d54-4329-9e80-55ae24289ee5","Type":"ContainerStarted","Data":"44c5f4296f5973ecd0ba6c8041ad428bebb1fc9eb07f1e5a4d35a57ad30ef790"} Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.353133 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6979cc54d6-q8q5n" event={"ID":"0f319ea1-f399-41ba-81cd-edccb9905c98","Type":"ContainerDied","Data":"841926dee436c6b507d7b1eab5d1c68eef782517c0305ef4874e7740c6b57873"} Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.353180 4713 scope.go:117] "RemoveContainer" 
containerID="006c8f9692e707cebd32acc9b7ecfe259f967159f41735427a59df8819530974" Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.353345 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6979cc54d6-q8q5n" Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.427694 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" podStartSLOduration=1.890217579 podStartE2EDuration="2.427678148s" podCreationTimestamp="2026-03-14 05:58:19 +0000 UTC" firstStartedPulling="2026-03-14 05:58:20.368758032 +0000 UTC m=+1883.456667342" lastFinishedPulling="2026-03-14 05:58:20.906218601 +0000 UTC m=+1883.994127911" observedRunningTime="2026-03-14 05:58:21.420723166 +0000 UTC m=+1884.508632466" watchObservedRunningTime="2026-03-14 05:58:21.427678148 +0000 UTC m=+1884.515587448" Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.486310 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6979cc54d6-q8q5n"] Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.510189 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6979cc54d6-q8q5n"] Mar 14 05:58:21 crc kubenswrapper[4713]: I0314 05:58:21.595637 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f319ea1-f399-41ba-81cd-edccb9905c98" path="/var/lib/kubelet/pods/0f319ea1-f399-41ba-81cd-edccb9905c98/volumes" Mar 14 05:58:22 crc kubenswrapper[4713]: I0314 05:58:22.774308 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpr4"] Mar 14 05:58:22 crc kubenswrapper[4713]: E0314 05:58:22.775348 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f319ea1-f399-41ba-81cd-edccb9905c98" containerName="heat-engine" Mar 14 05:58:22 crc kubenswrapper[4713]: I0314 05:58:22.775362 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0f319ea1-f399-41ba-81cd-edccb9905c98" containerName="heat-engine" Mar 14 05:58:22 crc kubenswrapper[4713]: I0314 05:58:22.775632 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f319ea1-f399-41ba-81cd-edccb9905c98" containerName="heat-engine" Mar 14 05:58:22 crc kubenswrapper[4713]: I0314 05:58:22.777333 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:22 crc kubenswrapper[4713]: I0314 05:58:22.788851 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpr4"] Mar 14 05:58:22 crc kubenswrapper[4713]: I0314 05:58:22.856145 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:22 crc kubenswrapper[4713]: I0314 05:58:22.910806 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-utilities\") pod \"redhat-marketplace-kqpr4\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:22 crc kubenswrapper[4713]: I0314 05:58:22.910893 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-catalog-content\") pod \"redhat-marketplace-kqpr4\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:22 crc kubenswrapper[4713]: I0314 05:58:22.911306 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmn5m\" (UniqueName: \"kubernetes.io/projected/5a90442a-1fac-4696-b3eb-f852bc0fbde3-kube-api-access-rmn5m\") pod \"redhat-marketplace-kqpr4\" (UID: 
\"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.013176 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t7qz\" (UniqueName: \"kubernetes.io/projected/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-kube-api-access-9t7qz\") pod \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.013233 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-combined-ca-bundle\") pod \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.013262 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-config-data\") pod \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.013376 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-scripts\") pod \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\" (UID: \"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4\") " Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.013719 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmn5m\" (UniqueName: \"kubernetes.io/projected/5a90442a-1fac-4696-b3eb-f852bc0fbde3-kube-api-access-rmn5m\") pod \"redhat-marketplace-kqpr4\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.013898 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-utilities\") pod \"redhat-marketplace-kqpr4\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.013948 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-catalog-content\") pod \"redhat-marketplace-kqpr4\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.014395 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-catalog-content\") pod \"redhat-marketplace-kqpr4\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.014427 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-utilities\") pod \"redhat-marketplace-kqpr4\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.022403 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-scripts" (OuterVolumeSpecName: "scripts") pod "e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4" (UID: "e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.029480 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmn5m\" (UniqueName: \"kubernetes.io/projected/5a90442a-1fac-4696-b3eb-f852bc0fbde3-kube-api-access-rmn5m\") pod \"redhat-marketplace-kqpr4\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.041529 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-kube-api-access-9t7qz" (OuterVolumeSpecName: "kube-api-access-9t7qz") pod "e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4" (UID: "e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4"). InnerVolumeSpecName "kube-api-access-9t7qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.058460 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4" (UID: "e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.058921 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-config-data" (OuterVolumeSpecName: "config-data") pod "e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4" (UID: "e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.116177 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t7qz\" (UniqueName: \"kubernetes.io/projected/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-kube-api-access-9t7qz\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.116224 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.116239 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.116251 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.170589 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.385278 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-z9m2b" event={"ID":"e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4","Type":"ContainerDied","Data":"da4e06ade3315eb971fa176e17e9dab0476c5c0ccfa50d04c3bf39244440b486"} Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.385622 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da4e06ade3315eb971fa176e17e9dab0476c5c0ccfa50d04c3bf39244440b486" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.385704 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-z9m2b" Mar 14 05:58:23 crc kubenswrapper[4713]: I0314 05:58:23.750961 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpr4"] Mar 14 05:58:24 crc kubenswrapper[4713]: I0314 05:58:24.408082 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerID="87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad" exitCode=0 Mar 14 05:58:24 crc kubenswrapper[4713]: I0314 05:58:24.408353 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpr4" event={"ID":"5a90442a-1fac-4696-b3eb-f852bc0fbde3","Type":"ContainerDied","Data":"87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad"} Mar 14 05:58:24 crc kubenswrapper[4713]: I0314 05:58:24.408382 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpr4" event={"ID":"5a90442a-1fac-4696-b3eb-f852bc0fbde3","Type":"ContainerStarted","Data":"5228ac23f3afd7ea1b56379d4c153c2102a69b666c8d0c7659456a2d6e2a5fa8"} Mar 14 05:58:25 crc kubenswrapper[4713]: I0314 05:58:25.421744 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpr4" event={"ID":"5a90442a-1fac-4696-b3eb-f852bc0fbde3","Type":"ContainerStarted","Data":"8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca"} Mar 14 05:58:25 crc kubenswrapper[4713]: I0314 05:58:25.677871 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerName="rabbitmq" containerID="cri-o://0e7788bf9dfb7f8e79a12f2578730a7fec1eb26cd7487a7c7aa6811cba7114bf" gracePeriod=604795 Mar 14 05:58:25 crc kubenswrapper[4713]: I0314 05:58:25.817548 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 14 05:58:25 crc 
kubenswrapper[4713]: I0314 05:58:25.997752 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.137:5671: connect: connection refused" Mar 14 05:58:26 crc kubenswrapper[4713]: I0314 05:58:26.022163 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 14 05:58:26 crc kubenswrapper[4713]: I0314 05:58:26.022555 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-api" containerID="cri-o://8cc1027dd9214f8fb90bd7bfc366e201cdc93cfceadd0588fd52f884f9dba7d2" gracePeriod=30 Mar 14 05:58:26 crc kubenswrapper[4713]: I0314 05:58:26.022662 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-notifier" containerID="cri-o://b01489335a9f4953f3cf8fce2d42cb7094033bbe51ce9acf26a53f8be2847286" gracePeriod=30 Mar 14 05:58:26 crc kubenswrapper[4713]: I0314 05:58:26.022670 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-listener" containerID="cri-o://e9c7f7748f525e49b7b2ee98aa1f7bd3ec75aab6eedc2b06c418287d3545e11c" gracePeriod=30 Mar 14 05:58:26 crc kubenswrapper[4713]: I0314 05:58:26.022756 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-evaluator" containerID="cri-o://64e7c847a0c6a09878a30dc88abdff28469857d22754a37ef3fd692f63cda61d" gracePeriod=30 Mar 14 05:58:26 crc kubenswrapper[4713]: I0314 05:58:26.439320 4713 generic.go:334] "Generic (PLEG): container finished" podID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" 
containerID="8cc1027dd9214f8fb90bd7bfc366e201cdc93cfceadd0588fd52f884f9dba7d2" exitCode=0 Mar 14 05:58:26 crc kubenswrapper[4713]: I0314 05:58:26.439445 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerDied","Data":"8cc1027dd9214f8fb90bd7bfc366e201cdc93cfceadd0588fd52f884f9dba7d2"} Mar 14 05:58:27 crc kubenswrapper[4713]: I0314 05:58:27.455073 4713 generic.go:334] "Generic (PLEG): container finished" podID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerID="64e7c847a0c6a09878a30dc88abdff28469857d22754a37ef3fd692f63cda61d" exitCode=0 Mar 14 05:58:27 crc kubenswrapper[4713]: I0314 05:58:27.455174 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerDied","Data":"64e7c847a0c6a09878a30dc88abdff28469857d22754a37ef3fd692f63cda61d"} Mar 14 05:58:27 crc kubenswrapper[4713]: I0314 05:58:27.458263 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerID="8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca" exitCode=0 Mar 14 05:58:27 crc kubenswrapper[4713]: I0314 05:58:27.458309 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpr4" event={"ID":"5a90442a-1fac-4696-b3eb-f852bc0fbde3","Type":"ContainerDied","Data":"8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca"} Mar 14 05:58:28 crc kubenswrapper[4713]: I0314 05:58:28.471590 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpr4" event={"ID":"5a90442a-1fac-4696-b3eb-f852bc0fbde3","Type":"ContainerStarted","Data":"b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297"} Mar 14 05:58:28 crc kubenswrapper[4713]: I0314 05:58:28.495233 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-kqpr4" podStartSLOduration=3.039915541 podStartE2EDuration="6.495190121s" podCreationTimestamp="2026-03-14 05:58:22 +0000 UTC" firstStartedPulling="2026-03-14 05:58:24.411025889 +0000 UTC m=+1887.498935189" lastFinishedPulling="2026-03-14 05:58:27.866300469 +0000 UTC m=+1890.954209769" observedRunningTime="2026-03-14 05:58:28.49356126 +0000 UTC m=+1891.581470580" watchObservedRunningTime="2026-03-14 05:58:28.495190121 +0000 UTC m=+1891.583099421" Mar 14 05:58:29 crc kubenswrapper[4713]: I0314 05:58:29.486056 4713 generic.go:334] "Generic (PLEG): container finished" podID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerID="b01489335a9f4953f3cf8fce2d42cb7094033bbe51ce9acf26a53f8be2847286" exitCode=0 Mar 14 05:58:29 crc kubenswrapper[4713]: I0314 05:58:29.486125 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerDied","Data":"b01489335a9f4953f3cf8fce2d42cb7094033bbe51ce9acf26a53f8be2847286"} Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.514993 4713 generic.go:334] "Generic (PLEG): container finished" podID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerID="e9c7f7748f525e49b7b2ee98aa1f7bd3ec75aab6eedc2b06c418287d3545e11c" exitCode=0 Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.515056 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerDied","Data":"e9c7f7748f525e49b7b2ee98aa1f7bd3ec75aab6eedc2b06c418287d3545e11c"} Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.568059 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:58:30 crc kubenswrapper[4713]: E0314 05:58:30.568416 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.700510 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.836797 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-combined-ca-bundle\") pod \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.837229 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-internal-tls-certs\") pod \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.837347 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnxf4\" (UniqueName: \"kubernetes.io/projected/8d161301-7e1c-4b23-a6d4-f250cd1ff761-kube-api-access-mnxf4\") pod \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.837368 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-public-tls-certs\") pod \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.837423 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-config-data\") pod \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.837456 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-scripts\") pod \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\" (UID: \"8d161301-7e1c-4b23-a6d4-f250cd1ff761\") " Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.851830 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-scripts" (OuterVolumeSpecName: "scripts") pod "8d161301-7e1c-4b23-a6d4-f250cd1ff761" (UID: "8d161301-7e1c-4b23-a6d4-f250cd1ff761"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.867837 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d161301-7e1c-4b23-a6d4-f250cd1ff761-kube-api-access-mnxf4" (OuterVolumeSpecName: "kube-api-access-mnxf4") pod "8d161301-7e1c-4b23-a6d4-f250cd1ff761" (UID: "8d161301-7e1c-4b23-a6d4-f250cd1ff761"). InnerVolumeSpecName "kube-api-access-mnxf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.908993 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8d161301-7e1c-4b23-a6d4-f250cd1ff761" (UID: "8d161301-7e1c-4b23-a6d4-f250cd1ff761"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.935380 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8d161301-7e1c-4b23-a6d4-f250cd1ff761" (UID: "8d161301-7e1c-4b23-a6d4-f250cd1ff761"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.940188 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnxf4\" (UniqueName: \"kubernetes.io/projected/8d161301-7e1c-4b23-a6d4-f250cd1ff761-kube-api-access-mnxf4\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.940241 4713 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.940255 4713 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.940267 4713 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:30 crc kubenswrapper[4713]: I0314 05:58:30.999753 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-config-data" (OuterVolumeSpecName: "config-data") pod "8d161301-7e1c-4b23-a6d4-f250cd1ff761" (UID: "8d161301-7e1c-4b23-a6d4-f250cd1ff761"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.021511 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d161301-7e1c-4b23-a6d4-f250cd1ff761" (UID: "8d161301-7e1c-4b23-a6d4-f250cd1ff761"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.042417 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.042456 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d161301-7e1c-4b23-a6d4-f250cd1ff761-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.527255 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8d161301-7e1c-4b23-a6d4-f250cd1ff761","Type":"ContainerDied","Data":"39547d820722c0c30217aa7ae9c2a722a14049e6069bc445cfe034931a6125ce"} Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.527585 4713 scope.go:117] "RemoveContainer" containerID="e9c7f7748f525e49b7b2ee98aa1f7bd3ec75aab6eedc2b06c418287d3545e11c" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.527325 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.553660 4713 scope.go:117] "RemoveContainer" containerID="b01489335a9f4953f3cf8fce2d42cb7094033bbe51ce9acf26a53f8be2847286" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.577392 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.585097 4713 scope.go:117] "RemoveContainer" containerID="64e7c847a0c6a09878a30dc88abdff28469857d22754a37ef3fd692f63cda61d" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.591217 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.646789 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 14 05:58:31 crc kubenswrapper[4713]: E0314 05:58:31.648036 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-api" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.648073 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-api" Mar 14 05:58:31 crc kubenswrapper[4713]: E0314 05:58:31.648119 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-notifier" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.648130 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-notifier" Mar 14 05:58:31 crc kubenswrapper[4713]: E0314 05:58:31.648168 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4" containerName="aodh-db-sync" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.648178 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4" containerName="aodh-db-sync" Mar 
14 05:58:31 crc kubenswrapper[4713]: E0314 05:58:31.648229 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-listener" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.648240 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-listener" Mar 14 05:58:31 crc kubenswrapper[4713]: E0314 05:58:31.648272 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-evaluator" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.648286 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-evaluator" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.649036 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-listener" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.649087 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-api" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.649119 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4" containerName="aodh-db-sync" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.649137 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-notifier" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.649158 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" containerName="aodh-evaluator" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.652730 4713 scope.go:117] "RemoveContainer" containerID="8cc1027dd9214f8fb90bd7bfc366e201cdc93cfceadd0588fd52f884f9dba7d2" Mar 14 05:58:31 crc 
kubenswrapper[4713]: I0314 05:58:31.658611 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.664400 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.665475 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.665663 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.666776 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2n7nm" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.666950 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.698737 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.768476 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-public-tls-certs\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.768548 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-internal-tls-certs\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.768644 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-config-data\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.768670 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-scripts\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.768691 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfxwr\" (UniqueName: \"kubernetes.io/projected/0567902a-8618-4a33-b632-ef2b6555c113-kube-api-access-jfxwr\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.768737 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.871100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.871419 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-public-tls-certs\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " 
pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.872106 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-internal-tls-certs\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.872310 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-config-data\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.872361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-scripts\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.872395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfxwr\" (UniqueName: \"kubernetes.io/projected/0567902a-8618-4a33-b632-ef2b6555c113-kube-api-access-jfxwr\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.880499 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-config-data\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.881644 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-public-tls-certs\") pod \"aodh-0\" (UID: 
\"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.883805 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-internal-tls-certs\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.888129 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.891736 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0567902a-8618-4a33-b632-ef2b6555c113-scripts\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.902805 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfxwr\" (UniqueName: \"kubernetes.io/projected/0567902a-8618-4a33-b632-ef2b6555c113-kube-api-access-jfxwr\") pod \"aodh-0\" (UID: \"0567902a-8618-4a33-b632-ef2b6555c113\") " pod="openstack/aodh-0" Mar 14 05:58:31 crc kubenswrapper[4713]: I0314 05:58:31.997432 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.550806 4713 generic.go:334] "Generic (PLEG): container finished" podID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerID="0e7788bf9dfb7f8e79a12f2578730a7fec1eb26cd7487a7c7aa6811cba7114bf" exitCode=0 Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.550896 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b8fef26f-0e1b-4e81-8969-a4b972708cb3","Type":"ContainerDied","Data":"0e7788bf9dfb7f8e79a12f2578730a7fec1eb26cd7487a7c7aa6811cba7114bf"} Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.598502 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 14 05:58:32 crc kubenswrapper[4713]: E0314 05:58:32.612344 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0567902a_8618_4a33_b632_ef2b6555c113.slice/crio-e4149cdda5da610fceaafde2a3dc4976da4df57325d384e7f7349ec375cfebd2\": RecentStats: unable to find data in memory cache]" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.730655 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.898854 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-erlang-cookie\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.898911 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8fef26f-0e1b-4e81-8969-a4b972708cb3-pod-info\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.898976 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-plugins\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.899009 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-tls\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.899088 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-plugins-conf\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.899180 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8fef26f-0e1b-4e81-8969-a4b972708cb3-erlang-cookie-secret\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.899323 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-server-conf\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.899351 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-confd\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.899375 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvxjq\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-kube-api-access-jvxjq\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.904309 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") " Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.904409 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-config-data\") pod \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\" (UID: \"b8fef26f-0e1b-4e81-8969-a4b972708cb3\") 
" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.904873 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.905326 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.905537 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.905562 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.905866 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.909820 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8fef26f-0e1b-4e81-8969-a4b972708cb3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.912306 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.916511 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-kube-api-access-jvxjq" (OuterVolumeSpecName: "kube-api-access-jvxjq") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "kube-api-access-jvxjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.933375 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b8fef26f-0e1b-4e81-8969-a4b972708cb3-pod-info" (OuterVolumeSpecName: "pod-info") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 05:58:32 crc kubenswrapper[4713]: I0314 05:58:32.957776 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac" (OuterVolumeSpecName: "persistence") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "pvc-ddd15150-6757-49af-bb52-e399dadc9dac". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:32.998296 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-server-conf" (OuterVolumeSpecName: "server-conf") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:32.999804 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-config-data" (OuterVolumeSpecName: "config-data") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.007331 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvxjq\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-kube-api-access-jvxjq\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.007379 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") on node \"crc\" " Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.007391 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.007403 4713 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8fef26f-0e1b-4e81-8969-a4b972708cb3-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.007411 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.007421 4713 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.007430 4713 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8fef26f-0e1b-4e81-8969-a4b972708cb3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 
05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.007437 4713 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8fef26f-0e1b-4e81-8969-a4b972708cb3-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.057645 4713 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.057839 4713 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ddd15150-6757-49af-bb52-e399dadc9dac" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac") on node "crc" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.082403 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b8fef26f-0e1b-4e81-8969-a4b972708cb3" (UID: "b8fef26f-0e1b-4e81-8969-a4b972708cb3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.112098 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8fef26f-0e1b-4e81-8969-a4b972708cb3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.112143 4713 reconciler_common.go:293] "Volume detached for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.169938 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.169998 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.239848 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.569488 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.584448 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d161301-7e1c-4b23-a6d4-f250cd1ff761" path="/var/lib/kubelet/pods/8d161301-7e1c-4b23-a6d4-f250cd1ff761/volumes" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.585773 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b8fef26f-0e1b-4e81-8969-a4b972708cb3","Type":"ContainerDied","Data":"1dca3daa524affe675e6a88e697e16dfbd60477868421a0126c4ac08a1f13492"} Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.585803 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0567902a-8618-4a33-b632-ef2b6555c113","Type":"ContainerStarted","Data":"1da7f6b458113a43eb26ceed0272a097b682196daef00474cceae97e351c68b4"} Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.585816 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0567902a-8618-4a33-b632-ef2b6555c113","Type":"ContainerStarted","Data":"e4149cdda5da610fceaafde2a3dc4976da4df57325d384e7f7349ec375cfebd2"} Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.586301 4713 scope.go:117] "RemoveContainer" containerID="0e7788bf9dfb7f8e79a12f2578730a7fec1eb26cd7487a7c7aa6811cba7114bf" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.630508 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.668678 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.673351 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.698800 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-server-1"] Mar 14 05:58:33 crc kubenswrapper[4713]: E0314 05:58:33.699363 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerName="setup-container" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.699384 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerName="setup-container" Mar 14 05:58:33 crc kubenswrapper[4713]: E0314 05:58:33.699415 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerName="rabbitmq" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.699422 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerName="rabbitmq" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.699676 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" containerName="rabbitmq" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.701035 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.724122 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.761488 4713 scope.go:117] "RemoveContainer" containerID="d4c0f09392624ed19b999fbb5f7689557216f97a3ec3122b4ec0efdb2a5ceb6e" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.803174 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpr4"] Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.829280 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.829382 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88fb9884-c3f2-4186-8161-159d30f0ee62-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.829420 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.829506 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/88fb9884-c3f2-4186-8161-159d30f0ee62-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.829668 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88fb9884-c3f2-4186-8161-159d30f0ee62-pod-info\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.829718 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.829836 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.829974 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88fb9884-c3f2-4186-8161-159d30f0ee62-server-conf\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.830078 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/88fb9884-c3f2-4186-8161-159d30f0ee62-config-data\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.830132 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b52lk\" (UniqueName: \"kubernetes.io/projected/88fb9884-c3f2-4186-8161-159d30f0ee62-kube-api-access-b52lk\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.830172 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.931723 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.931819 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88fb9884-c3f2-4186-8161-159d30f0ee62-server-conf\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.931858 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88fb9884-c3f2-4186-8161-159d30f0ee62-config-data\") pod 
\"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.931879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b52lk\" (UniqueName: \"kubernetes.io/projected/88fb9884-c3f2-4186-8161-159d30f0ee62-kube-api-access-b52lk\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.931905 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.931954 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.931977 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88fb9884-c3f2-4186-8161-159d30f0ee62-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.932001 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 
05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.932047 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88fb9884-c3f2-4186-8161-159d30f0ee62-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.932073 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88fb9884-c3f2-4186-8161-159d30f0ee62-pod-info\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.932094 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.932745 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.933666 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.934610 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88fb9884-c3f2-4186-8161-159d30f0ee62-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.935552 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88fb9884-c3f2-4186-8161-159d30f0ee62-config-data\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.935656 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88fb9884-c3f2-4186-8161-159d30f0ee62-server-conf\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.937515 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88fb9884-c3f2-4186-8161-159d30f0ee62-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.940454 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.940493 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88fb9884-c3f2-4186-8161-159d30f0ee62-pod-info\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" 
Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.940702 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.940736 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a35ef7d69b40ea4acd6c23ae4e052713a9902ba943c7b732213f20ace37b1946/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.958103 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88fb9884-c3f2-4186-8161-159d30f0ee62-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:33 crc kubenswrapper[4713]: I0314 05:58:33.973537 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b52lk\" (UniqueName: \"kubernetes.io/projected/88fb9884-c3f2-4186-8161-159d30f0ee62-kube-api-access-b52lk\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:34 crc kubenswrapper[4713]: I0314 05:58:34.051385 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ddd15150-6757-49af-bb52-e399dadc9dac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd15150-6757-49af-bb52-e399dadc9dac\") pod \"rabbitmq-server-1\" (UID: \"88fb9884-c3f2-4186-8161-159d30f0ee62\") " pod="openstack/rabbitmq-server-1" Mar 14 05:58:34 crc kubenswrapper[4713]: I0314 05:58:34.054787 4713 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 14 05:58:35 crc kubenswrapper[4713]: I0314 05:58:35.468950 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 14 05:58:35 crc kubenswrapper[4713]: I0314 05:58:35.581396 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8fef26f-0e1b-4e81-8969-a4b972708cb3" path="/var/lib/kubelet/pods/b8fef26f-0e1b-4e81-8969-a4b972708cb3/volumes" Mar 14 05:58:35 crc kubenswrapper[4713]: I0314 05:58:35.609022 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0567902a-8618-4a33-b632-ef2b6555c113","Type":"ContainerStarted","Data":"a9bd6046ca62248176c811de2286b2b4fba6b944fee813a6ee054e1225689289"} Mar 14 05:58:35 crc kubenswrapper[4713]: I0314 05:58:35.611267 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"88fb9884-c3f2-4186-8161-159d30f0ee62","Type":"ContainerStarted","Data":"e2be3ec9f829733c5acbefbec30ed3b2ae6717b75b26351d61fe7e2ddedd80ac"} Mar 14 05:58:35 crc kubenswrapper[4713]: I0314 05:58:35.611414 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kqpr4" podUID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerName="registry-server" containerID="cri-o://b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297" gracePeriod=2 Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.283692 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.404700 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmn5m\" (UniqueName: \"kubernetes.io/projected/5a90442a-1fac-4696-b3eb-f852bc0fbde3-kube-api-access-rmn5m\") pod \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.405021 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-utilities\") pod \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.406119 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-utilities" (OuterVolumeSpecName: "utilities") pod "5a90442a-1fac-4696-b3eb-f852bc0fbde3" (UID: "5a90442a-1fac-4696-b3eb-f852bc0fbde3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.405197 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-catalog-content\") pod \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\" (UID: \"5a90442a-1fac-4696-b3eb-f852bc0fbde3\") " Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.409332 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.409372 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a90442a-1fac-4696-b3eb-f852bc0fbde3-kube-api-access-rmn5m" (OuterVolumeSpecName: "kube-api-access-rmn5m") pod "5a90442a-1fac-4696-b3eb-f852bc0fbde3" (UID: "5a90442a-1fac-4696-b3eb-f852bc0fbde3"). InnerVolumeSpecName "kube-api-access-rmn5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.512045 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmn5m\" (UniqueName: \"kubernetes.io/projected/5a90442a-1fac-4696-b3eb-f852bc0fbde3-kube-api-access-rmn5m\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.631892 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0567902a-8618-4a33-b632-ef2b6555c113","Type":"ContainerStarted","Data":"efda8fd46311f79f9ed058efcf9282dd4f5e9346a559a476351140b75e138b12"} Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.639284 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a90442a-1fac-4696-b3eb-f852bc0fbde3" (UID: "5a90442a-1fac-4696-b3eb-f852bc0fbde3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.640299 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerID="b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297" exitCode=0 Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.640339 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpr4" event={"ID":"5a90442a-1fac-4696-b3eb-f852bc0fbde3","Type":"ContainerDied","Data":"b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297"} Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.640364 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpr4" event={"ID":"5a90442a-1fac-4696-b3eb-f852bc0fbde3","Type":"ContainerDied","Data":"5228ac23f3afd7ea1b56379d4c153c2102a69b666c8d0c7659456a2d6e2a5fa8"} Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.640380 4713 scope.go:117] "RemoveContainer" containerID="b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.640525 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqpr4" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.698560 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpr4"] Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.710009 4713 scope.go:117] "RemoveContainer" containerID="8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.719073 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a90442a-1fac-4696-b3eb-f852bc0fbde3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 05:58:36 crc kubenswrapper[4713]: I0314 05:58:36.721291 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpr4"] Mar 14 05:58:37 crc kubenswrapper[4713]: I0314 05:58:37.454622 4713 scope.go:117] "RemoveContainer" containerID="87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad" Mar 14 05:58:37 crc kubenswrapper[4713]: I0314 05:58:37.486382 4713 scope.go:117] "RemoveContainer" containerID="b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297" Mar 14 05:58:37 crc kubenswrapper[4713]: E0314 05:58:37.486880 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297\": container with ID starting with b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297 not found: ID does not exist" containerID="b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297" Mar 14 05:58:37 crc kubenswrapper[4713]: I0314 05:58:37.486915 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297"} err="failed to get container status 
\"b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297\": rpc error: code = NotFound desc = could not find container \"b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297\": container with ID starting with b6c062d25cd77e6a99ea07fa59cda380518544a5014df04b53d464822f03a297 not found: ID does not exist" Mar 14 05:58:37 crc kubenswrapper[4713]: I0314 05:58:37.486935 4713 scope.go:117] "RemoveContainer" containerID="8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca" Mar 14 05:58:37 crc kubenswrapper[4713]: E0314 05:58:37.487358 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca\": container with ID starting with 8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca not found: ID does not exist" containerID="8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca" Mar 14 05:58:37 crc kubenswrapper[4713]: I0314 05:58:37.487421 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca"} err="failed to get container status \"8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca\": rpc error: code = NotFound desc = could not find container \"8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca\": container with ID starting with 8216177d03b259d31bcec1ee3e8b42f0446ddcffeda8b07467124cf97ed889ca not found: ID does not exist" Mar 14 05:58:37 crc kubenswrapper[4713]: I0314 05:58:37.487454 4713 scope.go:117] "RemoveContainer" containerID="87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad" Mar 14 05:58:37 crc kubenswrapper[4713]: E0314 05:58:37.488483 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad\": container with ID starting with 87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad not found: ID does not exist" containerID="87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad" Mar 14 05:58:37 crc kubenswrapper[4713]: I0314 05:58:37.489553 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad"} err="failed to get container status \"87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad\": rpc error: code = NotFound desc = could not find container \"87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad\": container with ID starting with 87bfda84ad03623886310ad0cb209292af752f5c5bb0528da2cc17a5eab709ad not found: ID does not exist" Mar 14 05:58:37 crc kubenswrapper[4713]: I0314 05:58:37.579157 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" path="/var/lib/kubelet/pods/5a90442a-1fac-4696-b3eb-f852bc0fbde3/volumes" Mar 14 05:58:37 crc kubenswrapper[4713]: I0314 05:58:37.663850 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"88fb9884-c3f2-4186-8161-159d30f0ee62","Type":"ContainerStarted","Data":"b714ec6deb032868e30baea22ab144fd34c16dd68638f53fe56d181014988b3f"} Mar 14 05:58:38 crc kubenswrapper[4713]: I0314 05:58:38.681505 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0567902a-8618-4a33-b632-ef2b6555c113","Type":"ContainerStarted","Data":"107191ec61e23c605357915872c91d0b3e24c8085bd8eb76593a2757000fc96b"} Mar 14 05:58:38 crc kubenswrapper[4713]: I0314 05:58:38.713959 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.837678243 podStartE2EDuration="7.713913839s" podCreationTimestamp="2026-03-14 05:58:31 +0000 UTC" 
firstStartedPulling="2026-03-14 05:58:32.610961011 +0000 UTC m=+1895.698870311" lastFinishedPulling="2026-03-14 05:58:37.487196607 +0000 UTC m=+1900.575105907" observedRunningTime="2026-03-14 05:58:38.702264182 +0000 UTC m=+1901.790173482" watchObservedRunningTime="2026-03-14 05:58:38.713913839 +0000 UTC m=+1901.801823139" Mar 14 05:58:43 crc kubenswrapper[4713]: I0314 05:58:43.564538 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:58:43 crc kubenswrapper[4713]: E0314 05:58:43.565550 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:58:46 crc kubenswrapper[4713]: I0314 05:58:46.379084 4713 scope.go:117] "RemoveContainer" containerID="9bf31c596e86879aa5ffd03aac8798a400347ce413f33b26b8eae8b77c71d596" Mar 14 05:58:46 crc kubenswrapper[4713]: I0314 05:58:46.444030 4713 scope.go:117] "RemoveContainer" containerID="c477c82e94dd15fb3e06044eb41a97f13c532c496167fe75e233aa8daea3bc44" Mar 14 05:58:46 crc kubenswrapper[4713]: I0314 05:58:46.490452 4713 scope.go:117] "RemoveContainer" containerID="49496be45acfeac04b48718ccf5031d2ad533c9dc885cfa26d3bd6d13249bafa" Mar 14 05:58:56 crc kubenswrapper[4713]: I0314 05:58:56.564469 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:58:56 crc kubenswrapper[4713]: E0314 05:58:56.565331 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:59:09 crc kubenswrapper[4713]: I0314 05:59:09.563733 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:59:09 crc kubenswrapper[4713]: E0314 05:59:09.564612 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:59:10 crc kubenswrapper[4713]: I0314 05:59:10.045497 4713 generic.go:334] "Generic (PLEG): container finished" podID="88fb9884-c3f2-4186-8161-159d30f0ee62" containerID="b714ec6deb032868e30baea22ab144fd34c16dd68638f53fe56d181014988b3f" exitCode=0 Mar 14 05:59:10 crc kubenswrapper[4713]: I0314 05:59:10.045589 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"88fb9884-c3f2-4186-8161-159d30f0ee62","Type":"ContainerDied","Data":"b714ec6deb032868e30baea22ab144fd34c16dd68638f53fe56d181014988b3f"} Mar 14 05:59:11 crc kubenswrapper[4713]: I0314 05:59:11.058981 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"88fb9884-c3f2-4186-8161-159d30f0ee62","Type":"ContainerStarted","Data":"6815cdae725c979b7997f5ff7473c23a525ce54ee0d16353f824d3df4d6f1b13"} Mar 14 05:59:11 crc kubenswrapper[4713]: I0314 05:59:11.059603 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 14 05:59:11 crc kubenswrapper[4713]: I0314 05:59:11.093053 4713 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=38.093028683 podStartE2EDuration="38.093028683s" podCreationTimestamp="2026-03-14 05:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 05:59:11.079483168 +0000 UTC m=+1934.167392488" watchObservedRunningTime="2026-03-14 05:59:11.093028683 +0000 UTC m=+1934.180937983" Mar 14 05:59:20 crc kubenswrapper[4713]: I0314 05:59:20.563892 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:59:20 crc kubenswrapper[4713]: E0314 05:59:20.564462 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:59:24 crc kubenswrapper[4713]: I0314 05:59:24.059449 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 14 05:59:24 crc kubenswrapper[4713]: I0314 05:59:24.117549 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 05:59:28 crc kubenswrapper[4713]: I0314 05:59:28.734954 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" containerName="rabbitmq" containerID="cri-o://16cccd1c1879208b644d1ce063fc0207893ec42aa0044d8dfb2dcc4ce8e5f02a" gracePeriod=604796 Mar 14 05:59:32 crc kubenswrapper[4713]: I0314 05:59:32.563849 4713 scope.go:117] "RemoveContainer" 
containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:59:32 crc kubenswrapper[4713]: E0314 05:59:32.564882 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.358256 4713 generic.go:334] "Generic (PLEG): container finished" podID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" containerID="16cccd1c1879208b644d1ce063fc0207893ec42aa0044d8dfb2dcc4ce8e5f02a" exitCode=0 Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.358313 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35c58160-d324-41f9-8c2d-410ba3fb1bb5","Type":"ContainerDied","Data":"16cccd1c1879208b644d1ce063fc0207893ec42aa0044d8dfb2dcc4ce8e5f02a"} Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.485591 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.641438 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-confd\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.641565 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-plugins-conf\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.641588 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-tls\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.641640 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35c58160-d324-41f9-8c2d-410ba3fb1bb5-erlang-cookie-secret\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.642371 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.642659 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.642698 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-plugins\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.642751 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-config-data\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.642800 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-server-conf\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.642824 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcxv8\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-kube-api-access-fcxv8\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.642856 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-erlang-cookie\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.642973 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35c58160-d324-41f9-8c2d-410ba3fb1bb5-pod-info\") pod \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\" (UID: \"35c58160-d324-41f9-8c2d-410ba3fb1bb5\") " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.643985 4713 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.644031 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.645405 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.650398 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/35c58160-d324-41f9-8c2d-410ba3fb1bb5-pod-info" (OuterVolumeSpecName: "pod-info") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.650453 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.650483 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-kube-api-access-fcxv8" (OuterVolumeSpecName: "kube-api-access-fcxv8") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "kube-api-access-fcxv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.650557 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c58160-d324-41f9-8c2d-410ba3fb1bb5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.664753 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155" (OuterVolumeSpecName: "persistence") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.712319 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-config-data" (OuterVolumeSpecName: "config-data") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.746831 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-server-conf" (OuterVolumeSpecName: "server-conf") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.747944 4713 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35c58160-d324-41f9-8c2d-410ba3fb1bb5-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.747988 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.748034 4713 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35c58160-d324-41f9-8c2d-410ba3fb1bb5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.748085 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") on node \"crc\" " Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.748101 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.748114 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.748125 4713 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35c58160-d324-41f9-8c2d-410ba3fb1bb5-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc 
kubenswrapper[4713]: I0314 05:59:35.748138 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcxv8\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-kube-api-access-fcxv8\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.748150 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.809237 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "35c58160-d324-41f9-8c2d-410ba3fb1bb5" (UID: "35c58160-d324-41f9-8c2d-410ba3fb1bb5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.809989 4713 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.810218 4713 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155") on node "crc" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.850484 4713 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35c58160-d324-41f9-8c2d-410ba3fb1bb5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:35 crc kubenswrapper[4713]: I0314 05:59:35.850517 4713 reconciler_common.go:293] "Volume detached for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") on node \"crc\" DevicePath \"\"" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.373051 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35c58160-d324-41f9-8c2d-410ba3fb1bb5","Type":"ContainerDied","Data":"f82af143d02fd270cb62c42baceaebe3e8073dee67157a598f5ab0c4c9ac9b18"} Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.373551 4713 scope.go:117] "RemoveContainer" containerID="16cccd1c1879208b644d1ce063fc0207893ec42aa0044d8dfb2dcc4ce8e5f02a" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.373741 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.399237 4713 scope.go:117] "RemoveContainer" containerID="6d464f5b92a469a58eed5cef7c3b96f48a6b9cba513745ea0786417981a94f06" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.423043 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.442965 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.461029 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 05:59:36 crc kubenswrapper[4713]: E0314 05:59:36.461917 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerName="registry-server" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.461946 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerName="registry-server" Mar 14 05:59:36 crc kubenswrapper[4713]: E0314 05:59:36.461983 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" containerName="setup-container" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.461991 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" containerName="setup-container" Mar 14 05:59:36 crc kubenswrapper[4713]: E0314 05:59:36.462010 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerName="extract-content" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.462020 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerName="extract-content" Mar 14 05:59:36 crc kubenswrapper[4713]: E0314 05:59:36.462046 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" containerName="rabbitmq" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.462055 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" containerName="rabbitmq" Mar 14 05:59:36 crc kubenswrapper[4713]: E0314 05:59:36.462080 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerName="extract-utilities" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.462088 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerName="extract-utilities" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.464924 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" containerName="rabbitmq" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.464998 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a90442a-1fac-4696-b3eb-f852bc0fbde3" containerName="registry-server" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.466928 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.496003 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566253 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566419 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566452 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566486 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566530 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566563 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566617 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566671 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566722 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-config-data\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566772 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtv8h\" (UniqueName: 
\"kubernetes.io/projected/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-kube-api-access-xtv8h\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.566804 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.668671 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.668722 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.668772 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.668793 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") pod 
\"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.668858 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-config-data\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.668901 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtv8h\" (UniqueName: \"kubernetes.io/projected/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-kube-api-access-xtv8h\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.668927 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.669039 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.669172 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc 
kubenswrapper[4713]: I0314 05:59:36.669195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.669262 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.669913 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.670238 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.670500 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.670505 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-config-data\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.671386 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.671847 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.671880 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b6693a899cd4d176d3943cdc406828cdb93f1723f69ef2bee24905b0841bbd1d/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.672639 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.676489 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " 
pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.677757 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.683671 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.690013 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtv8h\" (UniqueName: \"kubernetes.io/projected/43aa4c5f-72e3-4b1c-8842-09c1af1abcc3-kube-api-access-xtv8h\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.741769 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8bca157-42f1-41d3-a3e6-237b62ff0155\") pod \"rabbitmq-server-0\" (UID: \"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3\") " pod="openstack/rabbitmq-server-0" Mar 14 05:59:36 crc kubenswrapper[4713]: I0314 05:59:36.809342 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 05:59:37 crc kubenswrapper[4713]: I0314 05:59:37.324971 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 05:59:37 crc kubenswrapper[4713]: I0314 05:59:37.398190 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3","Type":"ContainerStarted","Data":"72dc2917eca55ebf2178f6c67af1d0061e904eda50455798194c17da0e7916f4"} Mar 14 05:59:37 crc kubenswrapper[4713]: I0314 05:59:37.611986 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c58160-d324-41f9-8c2d-410ba3fb1bb5" path="/var/lib/kubelet/pods/35c58160-d324-41f9-8c2d-410ba3fb1bb5/volumes" Mar 14 05:59:39 crc kubenswrapper[4713]: I0314 05:59:39.424932 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3","Type":"ContainerStarted","Data":"e11dd301fa83fb6fd171b254f73398e09715aa9a6ddbd6292a8f9dee4cc8a274"} Mar 14 05:59:43 crc kubenswrapper[4713]: I0314 05:59:43.564246 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:59:43 crc kubenswrapper[4713]: E0314 05:59:43.565007 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 05:59:56 crc kubenswrapper[4713]: I0314 05:59:56.563988 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 05:59:56 crc kubenswrapper[4713]: E0314 05:59:56.565055 4713 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.163044 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557800-6ppxr"] Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.165983 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557800-6ppxr" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.169544 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.169565 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.170989 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.188292 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv"] Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.190531 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.193038 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.194114 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.206171 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557800-6ppxr"] Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.230254 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv"] Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.230431 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9hmm\" (UniqueName: \"kubernetes.io/projected/b3dbe397-34ac-4c7f-990e-cec91d6592a4-kube-api-access-g9hmm\") pod \"auto-csr-approver-29557800-6ppxr\" (UID: \"b3dbe397-34ac-4c7f-990e-cec91d6592a4\") " pod="openshift-infra/auto-csr-approver-29557800-6ppxr" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.332346 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqg4b\" (UniqueName: \"kubernetes.io/projected/75332f9a-1c54-42cf-8030-525ee3cffca1-kube-api-access-qqg4b\") pod \"collect-profiles-29557800-gn6hv\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.332451 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9hmm\" (UniqueName: 
\"kubernetes.io/projected/b3dbe397-34ac-4c7f-990e-cec91d6592a4-kube-api-access-g9hmm\") pod \"auto-csr-approver-29557800-6ppxr\" (UID: \"b3dbe397-34ac-4c7f-990e-cec91d6592a4\") " pod="openshift-infra/auto-csr-approver-29557800-6ppxr" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.332636 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75332f9a-1c54-42cf-8030-525ee3cffca1-config-volume\") pod \"collect-profiles-29557800-gn6hv\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.332724 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75332f9a-1c54-42cf-8030-525ee3cffca1-secret-volume\") pod \"collect-profiles-29557800-gn6hv\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.351144 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9hmm\" (UniqueName: \"kubernetes.io/projected/b3dbe397-34ac-4c7f-990e-cec91d6592a4-kube-api-access-g9hmm\") pod \"auto-csr-approver-29557800-6ppxr\" (UID: \"b3dbe397-34ac-4c7f-990e-cec91d6592a4\") " pod="openshift-infra/auto-csr-approver-29557800-6ppxr" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.434941 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75332f9a-1c54-42cf-8030-525ee3cffca1-config-volume\") pod \"collect-profiles-29557800-gn6hv\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 
06:00:00.435052 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75332f9a-1c54-42cf-8030-525ee3cffca1-secret-volume\") pod \"collect-profiles-29557800-gn6hv\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.435112 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqg4b\" (UniqueName: \"kubernetes.io/projected/75332f9a-1c54-42cf-8030-525ee3cffca1-kube-api-access-qqg4b\") pod \"collect-profiles-29557800-gn6hv\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.435870 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75332f9a-1c54-42cf-8030-525ee3cffca1-config-volume\") pod \"collect-profiles-29557800-gn6hv\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.439846 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75332f9a-1c54-42cf-8030-525ee3cffca1-secret-volume\") pod \"collect-profiles-29557800-gn6hv\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.452954 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqg4b\" (UniqueName: \"kubernetes.io/projected/75332f9a-1c54-42cf-8030-525ee3cffca1-kube-api-access-qqg4b\") pod \"collect-profiles-29557800-gn6hv\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.489805 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557800-6ppxr" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.512500 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:00 crc kubenswrapper[4713]: I0314 06:00:00.994579 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557800-6ppxr"] Mar 14 06:00:01 crc kubenswrapper[4713]: I0314 06:00:01.121572 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv"] Mar 14 06:00:01 crc kubenswrapper[4713]: W0314 06:00:01.123284 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75332f9a_1c54_42cf_8030_525ee3cffca1.slice/crio-a059545dc31748583fcdb02fb5ccb49ffd56bb4df1bb9ae7bcab868fd2ed39ac WatchSource:0}: Error finding container a059545dc31748583fcdb02fb5ccb49ffd56bb4df1bb9ae7bcab868fd2ed39ac: Status 404 returned error can't find the container with id a059545dc31748583fcdb02fb5ccb49ffd56bb4df1bb9ae7bcab868fd2ed39ac Mar 14 06:00:01 crc kubenswrapper[4713]: I0314 06:00:01.698426 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" event={"ID":"75332f9a-1c54-42cf-8030-525ee3cffca1","Type":"ContainerStarted","Data":"9c4f1e8a29a3f0aca04e04e6cb918dcfad20423f2d7a61a9830d5a1ed0b35ad7"} Mar 14 06:00:01 crc kubenswrapper[4713]: I0314 06:00:01.698763 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" 
event={"ID":"75332f9a-1c54-42cf-8030-525ee3cffca1","Type":"ContainerStarted","Data":"a059545dc31748583fcdb02fb5ccb49ffd56bb4df1bb9ae7bcab868fd2ed39ac"} Mar 14 06:00:01 crc kubenswrapper[4713]: I0314 06:00:01.699732 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557800-6ppxr" event={"ID":"b3dbe397-34ac-4c7f-990e-cec91d6592a4","Type":"ContainerStarted","Data":"3eca7094f77c1cb57315851544f6d85b892e2ed5c31becbf09def38ba526b868"} Mar 14 06:00:01 crc kubenswrapper[4713]: I0314 06:00:01.717552 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" podStartSLOduration=1.717534659 podStartE2EDuration="1.717534659s" podCreationTimestamp="2026-03-14 06:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 06:00:01.714066149 +0000 UTC m=+1984.801975449" watchObservedRunningTime="2026-03-14 06:00:01.717534659 +0000 UTC m=+1984.805443959" Mar 14 06:00:02 crc kubenswrapper[4713]: I0314 06:00:02.712946 4713 generic.go:334] "Generic (PLEG): container finished" podID="75332f9a-1c54-42cf-8030-525ee3cffca1" containerID="9c4f1e8a29a3f0aca04e04e6cb918dcfad20423f2d7a61a9830d5a1ed0b35ad7" exitCode=0 Mar 14 06:00:02 crc kubenswrapper[4713]: I0314 06:00:02.713012 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" event={"ID":"75332f9a-1c54-42cf-8030-525ee3cffca1","Type":"ContainerDied","Data":"9c4f1e8a29a3f0aca04e04e6cb918dcfad20423f2d7a61a9830d5a1ed0b35ad7"} Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.227293 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.350002 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75332f9a-1c54-42cf-8030-525ee3cffca1-config-volume\") pod \"75332f9a-1c54-42cf-8030-525ee3cffca1\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.350058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75332f9a-1c54-42cf-8030-525ee3cffca1-secret-volume\") pod \"75332f9a-1c54-42cf-8030-525ee3cffca1\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.350467 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqg4b\" (UniqueName: \"kubernetes.io/projected/75332f9a-1c54-42cf-8030-525ee3cffca1-kube-api-access-qqg4b\") pod \"75332f9a-1c54-42cf-8030-525ee3cffca1\" (UID: \"75332f9a-1c54-42cf-8030-525ee3cffca1\") " Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.352452 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75332f9a-1c54-42cf-8030-525ee3cffca1-config-volume" (OuterVolumeSpecName: "config-volume") pod "75332f9a-1c54-42cf-8030-525ee3cffca1" (UID: "75332f9a-1c54-42cf-8030-525ee3cffca1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.371479 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75332f9a-1c54-42cf-8030-525ee3cffca1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75332f9a-1c54-42cf-8030-525ee3cffca1" (UID: "75332f9a-1c54-42cf-8030-525ee3cffca1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.371615 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75332f9a-1c54-42cf-8030-525ee3cffca1-kube-api-access-qqg4b" (OuterVolumeSpecName: "kube-api-access-qqg4b") pod "75332f9a-1c54-42cf-8030-525ee3cffca1" (UID: "75332f9a-1c54-42cf-8030-525ee3cffca1"). InnerVolumeSpecName "kube-api-access-qqg4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.454721 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75332f9a-1c54-42cf-8030-525ee3cffca1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.454787 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75332f9a-1c54-42cf-8030-525ee3cffca1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.454803 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqg4b\" (UniqueName: \"kubernetes.io/projected/75332f9a-1c54-42cf-8030-525ee3cffca1-kube-api-access-qqg4b\") on node \"crc\" DevicePath \"\"" Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.739229 4713 generic.go:334] "Generic (PLEG): container finished" podID="b3dbe397-34ac-4c7f-990e-cec91d6592a4" containerID="24c1d9107c0fea6a541652c613b17559b142ac7f709d4a2c23933ff79704a398" exitCode=0 Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.739639 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557800-6ppxr" event={"ID":"b3dbe397-34ac-4c7f-990e-cec91d6592a4","Type":"ContainerDied","Data":"24c1d9107c0fea6a541652c613b17559b142ac7f709d4a2c23933ff79704a398"} Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.742318 4713 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" event={"ID":"75332f9a-1c54-42cf-8030-525ee3cffca1","Type":"ContainerDied","Data":"a059545dc31748583fcdb02fb5ccb49ffd56bb4df1bb9ae7bcab868fd2ed39ac"} Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.742364 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a059545dc31748583fcdb02fb5ccb49ffd56bb4df1bb9ae7bcab868fd2ed39ac" Mar 14 06:00:04 crc kubenswrapper[4713]: I0314 06:00:04.742415 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv" Mar 14 06:00:06 crc kubenswrapper[4713]: I0314 06:00:06.267156 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557800-6ppxr" Mar 14 06:00:06 crc kubenswrapper[4713]: I0314 06:00:06.402569 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9hmm\" (UniqueName: \"kubernetes.io/projected/b3dbe397-34ac-4c7f-990e-cec91d6592a4-kube-api-access-g9hmm\") pod \"b3dbe397-34ac-4c7f-990e-cec91d6592a4\" (UID: \"b3dbe397-34ac-4c7f-990e-cec91d6592a4\") " Mar 14 06:00:06 crc kubenswrapper[4713]: I0314 06:00:06.410699 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3dbe397-34ac-4c7f-990e-cec91d6592a4-kube-api-access-g9hmm" (OuterVolumeSpecName: "kube-api-access-g9hmm") pod "b3dbe397-34ac-4c7f-990e-cec91d6592a4" (UID: "b3dbe397-34ac-4c7f-990e-cec91d6592a4"). InnerVolumeSpecName "kube-api-access-g9hmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:00:06 crc kubenswrapper[4713]: I0314 06:00:06.506154 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9hmm\" (UniqueName: \"kubernetes.io/projected/b3dbe397-34ac-4c7f-990e-cec91d6592a4-kube-api-access-g9hmm\") on node \"crc\" DevicePath \"\"" Mar 14 06:00:06 crc kubenswrapper[4713]: I0314 06:00:06.766218 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557800-6ppxr" event={"ID":"b3dbe397-34ac-4c7f-990e-cec91d6592a4","Type":"ContainerDied","Data":"3eca7094f77c1cb57315851544f6d85b892e2ed5c31becbf09def38ba526b868"} Mar 14 06:00:06 crc kubenswrapper[4713]: I0314 06:00:06.766536 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eca7094f77c1cb57315851544f6d85b892e2ed5c31becbf09def38ba526b868" Mar 14 06:00:06 crc kubenswrapper[4713]: I0314 06:00:06.766435 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557800-6ppxr" Mar 14 06:00:07 crc kubenswrapper[4713]: I0314 06:00:07.341917 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557794-zhr59"] Mar 14 06:00:07 crc kubenswrapper[4713]: I0314 06:00:07.354911 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557794-zhr59"] Mar 14 06:00:07 crc kubenswrapper[4713]: I0314 06:00:07.577050 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d80784b9-9a04-4aca-9515-d9540532b039" path="/var/lib/kubelet/pods/d80784b9-9a04-4aca-9515-d9540532b039/volumes" Mar 14 06:00:10 crc kubenswrapper[4713]: I0314 06:00:10.564339 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 06:00:10 crc kubenswrapper[4713]: E0314 06:00:10.565090 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:00:12 crc kubenswrapper[4713]: I0314 06:00:12.824280 4713 generic.go:334] "Generic (PLEG): container finished" podID="43aa4c5f-72e3-4b1c-8842-09c1af1abcc3" containerID="e11dd301fa83fb6fd171b254f73398e09715aa9a6ddbd6292a8f9dee4cc8a274" exitCode=0 Mar 14 06:00:12 crc kubenswrapper[4713]: I0314 06:00:12.824383 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3","Type":"ContainerDied","Data":"e11dd301fa83fb6fd171b254f73398e09715aa9a6ddbd6292a8f9dee4cc8a274"} Mar 14 06:00:13 crc kubenswrapper[4713]: I0314 06:00:13.837512 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"43aa4c5f-72e3-4b1c-8842-09c1af1abcc3","Type":"ContainerStarted","Data":"e478a6cf76a8ad0311012b3324de0c2199d73fccc890f32142faac92514a39f4"} Mar 14 06:00:13 crc kubenswrapper[4713]: I0314 06:00:13.838230 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 14 06:00:13 crc kubenswrapper[4713]: I0314 06:00:13.869345 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.869324002 podStartE2EDuration="37.869324002s" podCreationTimestamp="2026-03-14 05:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 06:00:13.861807206 +0000 UTC m=+1996.949716536" watchObservedRunningTime="2026-03-14 06:00:13.869324002 +0000 UTC m=+1996.957233302" Mar 14 06:00:25 crc kubenswrapper[4713]: I0314 
06:00:25.564321 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 06:00:25 crc kubenswrapper[4713]: E0314 06:00:25.565095 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:00:26 crc kubenswrapper[4713]: I0314 06:00:26.814451 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 14 06:00:40 crc kubenswrapper[4713]: I0314 06:00:40.564327 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 06:00:40 crc kubenswrapper[4713]: E0314 06:00:40.566261 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:00:46 crc kubenswrapper[4713]: I0314 06:00:46.775005 4713 scope.go:117] "RemoveContainer" containerID="3f498a7d7eab4a99bb1cd3cb28fcb2e2b1bc211f98ac82859d3017abc3c1e54c" Mar 14 06:00:46 crc kubenswrapper[4713]: I0314 06:00:46.806415 4713 scope.go:117] "RemoveContainer" containerID="d179be8d99d176a187710044c0e144849dff020f4f067946d4be94af6f17ba4a" Mar 14 06:00:54 crc kubenswrapper[4713]: I0314 06:00:54.564031 4713 scope.go:117] "RemoveContainer" 
containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 06:00:55 crc kubenswrapper[4713]: I0314 06:00:55.303289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"9d66e0160c929053a77359b172aa48e22841072ca35af3852080f63092daa147"} Mar 14 06:00:59 crc kubenswrapper[4713]: I0314 06:00:59.048017 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-a0d1-account-create-update-nzqcb"] Mar 14 06:00:59 crc kubenswrapper[4713]: I0314 06:00:59.077479 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c7l2k"] Mar 14 06:00:59 crc kubenswrapper[4713]: I0314 06:00:59.095315 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-a0d1-account-create-update-nzqcb"] Mar 14 06:00:59 crc kubenswrapper[4713]: I0314 06:00:59.118541 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c7l2k"] Mar 14 06:00:59 crc kubenswrapper[4713]: I0314 06:00:59.577788 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57dc5163-22bf-4b59-a601-2ca96749dead" path="/var/lib/kubelet/pods/57dc5163-22bf-4b59-a601-2ca96749dead/volumes" Mar 14 06:00:59 crc kubenswrapper[4713]: I0314 06:00:59.581830 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3603f0e-10ee-4007-a5b0-83dfacefa9d4" path="/var/lib/kubelet/pods/d3603f0e-10ee-4007-a5b0-83dfacefa9d4/volumes" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.154107 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29557801-7h79h"] Mar 14 06:01:00 crc kubenswrapper[4713]: E0314 06:01:00.155122 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75332f9a-1c54-42cf-8030-525ee3cffca1" 
containerName="collect-profiles" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.155142 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="75332f9a-1c54-42cf-8030-525ee3cffca1" containerName="collect-profiles" Mar 14 06:01:00 crc kubenswrapper[4713]: E0314 06:01:00.155181 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3dbe397-34ac-4c7f-990e-cec91d6592a4" containerName="oc" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.155189 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3dbe397-34ac-4c7f-990e-cec91d6592a4" containerName="oc" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.155557 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3dbe397-34ac-4c7f-990e-cec91d6592a4" containerName="oc" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.155609 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="75332f9a-1c54-42cf-8030-525ee3cffca1" containerName="collect-profiles" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.156692 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.170658 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557801-7h79h"] Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.276169 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-combined-ca-bundle\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.276286 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xn69\" (UniqueName: \"kubernetes.io/projected/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-kube-api-access-8xn69\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.276419 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-config-data\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.276532 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-fernet-keys\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.379199 4713 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-fernet-keys\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.379330 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-combined-ca-bundle\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.379403 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xn69\" (UniqueName: \"kubernetes.io/projected/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-kube-api-access-8xn69\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.379524 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-config-data\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.387242 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-combined-ca-bundle\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.387279 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-fernet-keys\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.387795 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-config-data\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.402636 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xn69\" (UniqueName: \"kubernetes.io/projected/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-kube-api-access-8xn69\") pod \"keystone-cron-29557801-7h79h\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:00 crc kubenswrapper[4713]: I0314 06:01:00.495549 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:01 crc kubenswrapper[4713]: I0314 06:01:01.023440 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557801-7h79h"] Mar 14 06:01:01 crc kubenswrapper[4713]: I0314 06:01:01.372390 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557801-7h79h" event={"ID":"38b13150-e5f8-4a0e-93a6-c0f07c7e600e","Type":"ContainerStarted","Data":"ee2893ffb094129504e91b6131fe58f5395dcb2ee40bbd90ccc09e826516ce58"} Mar 14 06:01:01 crc kubenswrapper[4713]: I0314 06:01:01.372710 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557801-7h79h" event={"ID":"38b13150-e5f8-4a0e-93a6-c0f07c7e600e","Type":"ContainerStarted","Data":"c0af30ef26f43211055d7450de5ade67cc0b4b98415f8a3a792e7b8c20b6091e"} Mar 14 06:01:01 crc kubenswrapper[4713]: I0314 06:01:01.391827 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29557801-7h79h" podStartSLOduration=1.391803789 podStartE2EDuration="1.391803789s" podCreationTimestamp="2026-03-14 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 06:01:01.389565728 +0000 UTC m=+2044.477475068" watchObservedRunningTime="2026-03-14 06:01:01.391803789 +0000 UTC m=+2044.479713089" Mar 14 06:01:04 crc kubenswrapper[4713]: I0314 06:01:04.409633 4713 generic.go:334] "Generic (PLEG): container finished" podID="38b13150-e5f8-4a0e-93a6-c0f07c7e600e" containerID="ee2893ffb094129504e91b6131fe58f5395dcb2ee40bbd90ccc09e826516ce58" exitCode=0 Mar 14 06:01:04 crc kubenswrapper[4713]: I0314 06:01:04.409748 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557801-7h79h" 
event={"ID":"38b13150-e5f8-4a0e-93a6-c0f07c7e600e","Type":"ContainerDied","Data":"ee2893ffb094129504e91b6131fe58f5395dcb2ee40bbd90ccc09e826516ce58"} Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.040957 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bb1d-account-create-update-btnkl"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.059410 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a8cb-account-create-update-jbn8g"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.070960 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-blfw6"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.082029 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5m68v"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.091947 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bb1d-account-create-update-btnkl"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.102533 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a8cb-account-create-update-jbn8g"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.113244 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b8a0-account-create-update-rxmjz"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.123760 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5m68v"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.133583 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-blfw6"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.144042 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b8a0-account-create-update-rxmjz"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.155771 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-db-create-bw7wg"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.170583 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bw7wg"] Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.588798 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1" path="/var/lib/kubelet/pods/0d93c4d7-bd1e-4cfd-a007-b3aa1d12c7a1/volumes" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.593990 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1024d39c-ff15-450a-a55d-c0d673c3a8de" path="/var/lib/kubelet/pods/1024d39c-ff15-450a-a55d-c0d673c3a8de/volumes" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.596121 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="597dff88-51ea-4b05-8548-c11611e05914" path="/var/lib/kubelet/pods/597dff88-51ea-4b05-8548-c11611e05914/volumes" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.599016 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7493cc88-066e-42ce-880d-5544ba4b0b39" path="/var/lib/kubelet/pods/7493cc88-066e-42ce-880d-5544ba4b0b39/volumes" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.602241 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86240a8e-7ae2-4cd5-a608-ed6986152ef9" path="/var/lib/kubelet/pods/86240a8e-7ae2-4cd5-a608-ed6986152ef9/volumes" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.604231 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5774e38-4c05-4f83-9b36-714a22a03d0d" path="/var/lib/kubelet/pods/b5774e38-4c05-4f83-9b36-714a22a03d0d/volumes" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.806637 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.924762 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-combined-ca-bundle\") pod \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.924960 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-fernet-keys\") pod \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.924983 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-config-data\") pod \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.925015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xn69\" (UniqueName: \"kubernetes.io/projected/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-kube-api-access-8xn69\") pod \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\" (UID: \"38b13150-e5f8-4a0e-93a6-c0f07c7e600e\") " Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.931681 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "38b13150-e5f8-4a0e-93a6-c0f07c7e600e" (UID: "38b13150-e5f8-4a0e-93a6-c0f07c7e600e"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.931815 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-kube-api-access-8xn69" (OuterVolumeSpecName: "kube-api-access-8xn69") pod "38b13150-e5f8-4a0e-93a6-c0f07c7e600e" (UID: "38b13150-e5f8-4a0e-93a6-c0f07c7e600e"). InnerVolumeSpecName "kube-api-access-8xn69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.965231 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38b13150-e5f8-4a0e-93a6-c0f07c7e600e" (UID: "38b13150-e5f8-4a0e-93a6-c0f07c7e600e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:01:05 crc kubenswrapper[4713]: I0314 06:01:05.989591 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-config-data" (OuterVolumeSpecName: "config-data") pod "38b13150-e5f8-4a0e-93a6-c0f07c7e600e" (UID: "38b13150-e5f8-4a0e-93a6-c0f07c7e600e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:01:06 crc kubenswrapper[4713]: I0314 06:01:06.029229 4713 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 06:01:06 crc kubenswrapper[4713]: I0314 06:01:06.029278 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 06:01:06 crc kubenswrapper[4713]: I0314 06:01:06.029289 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xn69\" (UniqueName: \"kubernetes.io/projected/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-kube-api-access-8xn69\") on node \"crc\" DevicePath \"\"" Mar 14 06:01:06 crc kubenswrapper[4713]: I0314 06:01:06.029302 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38b13150-e5f8-4a0e-93a6-c0f07c7e600e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:01:06 crc kubenswrapper[4713]: I0314 06:01:06.439411 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557801-7h79h" event={"ID":"38b13150-e5f8-4a0e-93a6-c0f07c7e600e","Type":"ContainerDied","Data":"c0af30ef26f43211055d7450de5ade67cc0b4b98415f8a3a792e7b8c20b6091e"} Mar 14 06:01:06 crc kubenswrapper[4713]: I0314 06:01:06.439819 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0af30ef26f43211055d7450de5ade67cc0b4b98415f8a3a792e7b8c20b6091e" Mar 14 06:01:06 crc kubenswrapper[4713]: I0314 06:01:06.439557 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557801-7h79h" Mar 14 06:01:12 crc kubenswrapper[4713]: I0314 06:01:12.050477 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-25kgx"] Mar 14 06:01:12 crc kubenswrapper[4713]: I0314 06:01:12.081903 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"] Mar 14 06:01:12 crc kubenswrapper[4713]: I0314 06:01:12.099018 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-03ff-account-create-update-wrwt9"] Mar 14 06:01:12 crc kubenswrapper[4713]: I0314 06:01:12.112152 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-03ff-account-create-update-wrwt9"] Mar 14 06:01:12 crc kubenswrapper[4713]: I0314 06:01:12.123750 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-mhq2v"] Mar 14 06:01:12 crc kubenswrapper[4713]: I0314 06:01:12.134558 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-25kgx"] Mar 14 06:01:13 crc kubenswrapper[4713]: I0314 06:01:13.577956 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f655395-3e0b-452b-a92a-16ac7edb5707" path="/var/lib/kubelet/pods/5f655395-3e0b-452b-a92a-16ac7edb5707/volumes" Mar 14 06:01:13 crc kubenswrapper[4713]: I0314 06:01:13.578947 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2264856-3255-43b9-9fd5-50dd1f7d5797" path="/var/lib/kubelet/pods/c2264856-3255-43b9-9fd5-50dd1f7d5797/volumes" Mar 14 06:01:13 crc kubenswrapper[4713]: I0314 06:01:13.579676 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50144ed-c64d-4618-9ac5-afcf5aea1812" path="/var/lib/kubelet/pods/f50144ed-c64d-4618-9ac5-afcf5aea1812/volumes" Mar 14 06:01:41 crc kubenswrapper[4713]: I0314 06:01:41.043478 4713 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-db-sync-k57vf"] Mar 14 06:01:41 crc kubenswrapper[4713]: I0314 06:01:41.059524 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-k57vf"] Mar 14 06:01:41 crc kubenswrapper[4713]: I0314 06:01:41.576717 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4f6d50-5931-4dec-82ed-606d0a53fb6e" path="/var/lib/kubelet/pods/eb4f6d50-5931-4dec-82ed-606d0a53fb6e/volumes" Mar 14 06:01:42 crc kubenswrapper[4713]: I0314 06:01:42.029854 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bqmlg"] Mar 14 06:01:42 crc kubenswrapper[4713]: I0314 06:01:42.041472 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-pfctt"] Mar 14 06:01:42 crc kubenswrapper[4713]: I0314 06:01:42.053040 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-aeae-account-create-update-5rh7j"] Mar 14 06:01:42 crc kubenswrapper[4713]: I0314 06:01:42.063458 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c7b9-account-create-update-8bt6b"] Mar 14 06:01:42 crc kubenswrapper[4713]: I0314 06:01:42.073803 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-pfctt"] Mar 14 06:01:42 crc kubenswrapper[4713]: I0314 06:01:42.086897 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-aeae-account-create-update-5rh7j"] Mar 14 06:01:42 crc kubenswrapper[4713]: I0314 06:01:42.098815 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bqmlg"] Mar 14 06:01:42 crc kubenswrapper[4713]: I0314 06:01:42.111900 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c7b9-account-create-update-8bt6b"] Mar 14 06:01:43 crc kubenswrapper[4713]: I0314 06:01:43.579477 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02460855-3f34-4eeb-b287-3b2a5fb94d89" 
path="/var/lib/kubelet/pods/02460855-3f34-4eeb-b287-3b2a5fb94d89/volumes" Mar 14 06:01:43 crc kubenswrapper[4713]: I0314 06:01:43.580656 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9b1fa4-a9bd-434e-aee8-3ca38abfd427" path="/var/lib/kubelet/pods/6f9b1fa4-a9bd-434e-aee8-3ca38abfd427/volumes" Mar 14 06:01:43 crc kubenswrapper[4713]: I0314 06:01:43.581828 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="897c3cbc-a12b-486f-887e-59b4d6e37f42" path="/var/lib/kubelet/pods/897c3cbc-a12b-486f-887e-59b4d6e37f42/volumes" Mar 14 06:01:43 crc kubenswrapper[4713]: I0314 06:01:43.583565 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd" path="/var/lib/kubelet/pods/b9b3edfe-0d39-4d51-9650-14b2cf6eb3bd/volumes" Mar 14 06:01:46 crc kubenswrapper[4713]: I0314 06:01:46.919484 4713 scope.go:117] "RemoveContainer" containerID="b789bc5a97672113bc496f3ba7a6133874097474ec2ff68ab4098765c81dd3d7" Mar 14 06:01:46 crc kubenswrapper[4713]: I0314 06:01:46.958637 4713 scope.go:117] "RemoveContainer" containerID="900b57803c150fa7f7d5259939e95ff7c73678ab02eb25cff96418b1217ef2bf" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.013980 4713 scope.go:117] "RemoveContainer" containerID="99284074a67e217fd7bba1e817575f53c525035b8bf7dad3ca0327acde0dac45" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.048313 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a7fc-account-create-update-7pwn5"] Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.071937 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-5p56m"] Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.099458 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-da85-account-create-update-dnlj4"] Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.099782 4713 scope.go:117] "RemoveContainer" 
containerID="a21e67a562cec14af7da9e09edb66b2cc3e7548a15c59e814c467a77281edd21" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.112527 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8cm8b"] Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.124805 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a7fc-account-create-update-7pwn5"] Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.135153 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-5p56m"] Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.147956 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-da85-account-create-update-dnlj4"] Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.152041 4713 scope.go:117] "RemoveContainer" containerID="20011c95aeccd5bb5f40ebe9c34709dc507e19f8af4c4da321b7588f818b892f" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.162045 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8cm8b"] Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.218731 4713 scope.go:117] "RemoveContainer" containerID="484baeff2604aaaea6f8e0fb3e73f32235d5013349a6e15e4dbd13359b25f5d5" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.243106 4713 scope.go:117] "RemoveContainer" containerID="77d472d7bd1b0c1c2a37b871ae49f45d40c3269946e317cda0e4bb46ee6f5e5a" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.304552 4713 scope.go:117] "RemoveContainer" containerID="42f2a5e13832434ef374dcf60639148caadce7c2d8e31df28a354261d6184932" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.333023 4713 scope.go:117] "RemoveContainer" containerID="307cdc6c35584efdf33e740cd56d094a673973aebf34901985ccf490976e227f" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.366746 4713 scope.go:117] "RemoveContainer" containerID="3effbde9bf7f8bee2421c71f66aec38766c89897bd8eb23f9fe9af65fd35b541" Mar 14 06:01:47 
crc kubenswrapper[4713]: I0314 06:01:47.394268 4713 scope.go:117] "RemoveContainer" containerID="3764ae1af181ba0be89bfab90046d4136615a2cb3347388420d15d69c9b07f05" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.426365 4713 scope.go:117] "RemoveContainer" containerID="51d42e8f7874016a02750cc5ed46e5ace9e5c9d2e7df4f0e0842bc73f90e1203" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.454267 4713 scope.go:117] "RemoveContainer" containerID="89dfe6b3081314697d43fece08c5ec9c83ad0c9e79cf16386926da53b558d228" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.476838 4713 scope.go:117] "RemoveContainer" containerID="8b3defe155af2ce3acd26995fa8925c1eea7a2928ce865f2a261f341c3a442d6" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.512703 4713 scope.go:117] "RemoveContainer" containerID="fb2f44f477d8431377ea7d4e65887deb9c1c517a62285744a683a9967a40bb03" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.542448 4713 scope.go:117] "RemoveContainer" containerID="c4d7bf3a50ded96d88f830871f6dc232b457a9ec4ef2ec1970a2b11160bdd85d" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.576077 4713 scope.go:117] "RemoveContainer" containerID="89db566da35dcb73d2dc6c608c23426d2ac6e7da9f7d335441285fb319f440f4" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.586628 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="166c7fc9-cfdb-448b-8a3e-2915689f014e" path="/var/lib/kubelet/pods/166c7fc9-cfdb-448b-8a3e-2915689f014e/volumes" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.587305 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279d8812-ca1b-4f1e-a094-072076726e8c" path="/var/lib/kubelet/pods/279d8812-ca1b-4f1e-a094-072076726e8c/volumes" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.588299 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af90c93b-5f4e-41c0-8a65-1a480062a11f" path="/var/lib/kubelet/pods/af90c93b-5f4e-41c0-8a65-1a480062a11f/volumes" Mar 14 
06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.589358 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda8f62b-2f71-482e-8189-ce9ec768da83" path="/var/lib/kubelet/pods/bda8f62b-2f71-482e-8189-ce9ec768da83/volumes" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.598864 4713 scope.go:117] "RemoveContainer" containerID="279157b447635d024fadc3bd2a98465d4867b3b3b30d9555f8b42045f4909f73" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.627319 4713 scope.go:117] "RemoveContainer" containerID="88e7070f953f9280c7f2352f2b58f94f91f7a04459a7c836d939181da0645465" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.652248 4713 scope.go:117] "RemoveContainer" containerID="1a44d6756bdab94bc8e566d791d5907f49904cdbd29506a3b8e8519aa3086522" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.673384 4713 scope.go:117] "RemoveContainer" containerID="03f656d121c7b385c651f978df47e7b8d3a9bd88e9828ea906180c09be2d8d02" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.706388 4713 scope.go:117] "RemoveContainer" containerID="124fa9d6943941875505a667614e4fbc8701d12fa3947c9be1d76d76e9aee269" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.736132 4713 scope.go:117] "RemoveContainer" containerID="3eaa43710026ffc438f6bc547e3e6ca4a5f16f637dced7e35ee2b54255c537d6" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.781990 4713 scope.go:117] "RemoveContainer" containerID="cfe54626451b26876a8fa637d0196c3a247154ff67988db7f58cf5b1db5d4b1a" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.803165 4713 scope.go:117] "RemoveContainer" containerID="4e39a3745d98fd2e2c4c33f967ae16aa87edf948fcbcd5f76b1699a2270821aa" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.825351 4713 scope.go:117] "RemoveContainer" containerID="f5b7814e09ece1d77110d47ef200c49c637eac9a9ab47d2f394126768a7a087f" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.845533 4713 scope.go:117] "RemoveContainer" 
containerID="9eae25a1f97b97f263464a970d67909a37d58436ffb95b919d1a7b642f84de77" Mar 14 06:01:47 crc kubenswrapper[4713]: I0314 06:01:47.882898 4713 scope.go:117] "RemoveContainer" containerID="68dbafeb9318baf2c027b788f3e66dffc0959a917f4bf3aec64bc8a4c9535ef8" Mar 14 06:01:57 crc kubenswrapper[4713]: I0314 06:01:57.034667 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-g2xdv"] Mar 14 06:01:57 crc kubenswrapper[4713]: I0314 06:01:57.048246 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-g2xdv"] Mar 14 06:01:57 crc kubenswrapper[4713]: I0314 06:01:57.603270 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9f0d8e7-9d94-46ae-a721-d4557e09a0e9" path="/var/lib/kubelet/pods/b9f0d8e7-9d94-46ae-a721-d4557e09a0e9/volumes" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.065613 4713 generic.go:334] "Generic (PLEG): container finished" podID="399681c2-4d54-4329-9e80-55ae24289ee5" containerID="316400ed1e50d7b202302ad4d86b86613ce0ce952e9573ce817f569df7e6d5bc" exitCode=0 Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.067197 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" event={"ID":"399681c2-4d54-4329-9e80-55ae24289ee5","Type":"ContainerDied","Data":"316400ed1e50d7b202302ad4d86b86613ce0ce952e9573ce817f569df7e6d5bc"} Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.148490 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557802-8lwbs"] Mar 14 06:02:00 crc kubenswrapper[4713]: E0314 06:02:00.149221 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b13150-e5f8-4a0e-93a6-c0f07c7e600e" containerName="keystone-cron" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.149244 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b13150-e5f8-4a0e-93a6-c0f07c7e600e" containerName="keystone-cron" Mar 14 06:02:00 crc 
kubenswrapper[4713]: I0314 06:02:00.149510 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b13150-e5f8-4a0e-93a6-c0f07c7e600e" containerName="keystone-cron" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.150679 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557802-8lwbs" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.153133 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.156980 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.157162 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.166059 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557802-8lwbs"] Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.251417 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmm7w\" (UniqueName: \"kubernetes.io/projected/2a8806b4-7007-42b9-b55b-2392aab57894-kube-api-access-rmm7w\") pod \"auto-csr-approver-29557802-8lwbs\" (UID: \"2a8806b4-7007-42b9-b55b-2392aab57894\") " pod="openshift-infra/auto-csr-approver-29557802-8lwbs" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.353991 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmm7w\" (UniqueName: \"kubernetes.io/projected/2a8806b4-7007-42b9-b55b-2392aab57894-kube-api-access-rmm7w\") pod \"auto-csr-approver-29557802-8lwbs\" (UID: \"2a8806b4-7007-42b9-b55b-2392aab57894\") " pod="openshift-infra/auto-csr-approver-29557802-8lwbs" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.376623 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmm7w\" (UniqueName: \"kubernetes.io/projected/2a8806b4-7007-42b9-b55b-2392aab57894-kube-api-access-rmm7w\") pod \"auto-csr-approver-29557802-8lwbs\" (UID: \"2a8806b4-7007-42b9-b55b-2392aab57894\") " pod="openshift-infra/auto-csr-approver-29557802-8lwbs" Mar 14 06:02:00 crc kubenswrapper[4713]: I0314 06:02:00.478311 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557802-8lwbs" Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.047042 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557802-8lwbs"] Mar 14 06:02:01 crc kubenswrapper[4713]: W0314 06:02:01.048625 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8806b4_7007_42b9_b55b_2392aab57894.slice/crio-0a20182755e06fdea4c398484d004ee63f51e55a5f37a33b12b9d29226352201 WatchSource:0}: Error finding container 0a20182755e06fdea4c398484d004ee63f51e55a5f37a33b12b9d29226352201: Status 404 returned error can't find the container with id 0a20182755e06fdea4c398484d004ee63f51e55a5f37a33b12b9d29226352201 Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.078039 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557802-8lwbs" event={"ID":"2a8806b4-7007-42b9-b55b-2392aab57894","Type":"ContainerStarted","Data":"0a20182755e06fdea4c398484d004ee63f51e55a5f37a33b12b9d29226352201"} Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.543719 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.588622 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khkzc\" (UniqueName: \"kubernetes.io/projected/399681c2-4d54-4329-9e80-55ae24289ee5-kube-api-access-khkzc\") pod \"399681c2-4d54-4329-9e80-55ae24289ee5\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.588909 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-bootstrap-combined-ca-bundle\") pod \"399681c2-4d54-4329-9e80-55ae24289ee5\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.589110 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-inventory\") pod \"399681c2-4d54-4329-9e80-55ae24289ee5\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.589265 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-ssh-key-openstack-edpm-ipam\") pod \"399681c2-4d54-4329-9e80-55ae24289ee5\" (UID: \"399681c2-4d54-4329-9e80-55ae24289ee5\") " Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.595591 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "399681c2-4d54-4329-9e80-55ae24289ee5" (UID: "399681c2-4d54-4329-9e80-55ae24289ee5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.597372 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399681c2-4d54-4329-9e80-55ae24289ee5-kube-api-access-khkzc" (OuterVolumeSpecName: "kube-api-access-khkzc") pod "399681c2-4d54-4329-9e80-55ae24289ee5" (UID: "399681c2-4d54-4329-9e80-55ae24289ee5"). InnerVolumeSpecName "kube-api-access-khkzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.627264 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "399681c2-4d54-4329-9e80-55ae24289ee5" (UID: "399681c2-4d54-4329-9e80-55ae24289ee5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.636327 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-inventory" (OuterVolumeSpecName: "inventory") pod "399681c2-4d54-4329-9e80-55ae24289ee5" (UID: "399681c2-4d54-4329-9e80-55ae24289ee5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.693244 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.693450 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.693581 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khkzc\" (UniqueName: \"kubernetes.io/projected/399681c2-4d54-4329-9e80-55ae24289ee5-kube-api-access-khkzc\") on node \"crc\" DevicePath \"\"" Mar 14 06:02:01 crc kubenswrapper[4713]: I0314 06:02:01.693642 4713 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399681c2-4d54-4329-9e80-55ae24289ee5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.088499 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557802-8lwbs" event={"ID":"2a8806b4-7007-42b9-b55b-2392aab57894","Type":"ContainerStarted","Data":"e5fba5c615ba9c66775f99bcbb12687420fb521a0f71bd92b08f265b9ff80244"} Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.090831 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" event={"ID":"399681c2-4d54-4329-9e80-55ae24289ee5","Type":"ContainerDied","Data":"44c5f4296f5973ecd0ba6c8041ad428bebb1fc9eb07f1e5a4d35a57ad30ef790"} Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.090880 4713 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="44c5f4296f5973ecd0ba6c8041ad428bebb1fc9eb07f1e5a4d35a57ad30ef790" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.090872 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.105682 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557802-8lwbs" podStartSLOduration=1.34731967 podStartE2EDuration="2.105665434s" podCreationTimestamp="2026-03-14 06:02:00 +0000 UTC" firstStartedPulling="2026-03-14 06:02:01.050942212 +0000 UTC m=+2104.138851512" lastFinishedPulling="2026-03-14 06:02:01.809287976 +0000 UTC m=+2104.897197276" observedRunningTime="2026-03-14 06:02:02.101169371 +0000 UTC m=+2105.189078671" watchObservedRunningTime="2026-03-14 06:02:02.105665434 +0000 UTC m=+2105.193574734" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.162408 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk"] Mar 14 06:02:02 crc kubenswrapper[4713]: E0314 06:02:02.163160 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399681c2-4d54-4329-9e80-55ae24289ee5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.163940 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="399681c2-4d54-4329-9e80-55ae24289ee5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.164265 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="399681c2-4d54-4329-9e80-55ae24289ee5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.165239 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.168254 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.168335 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.168530 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.169028 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.179522 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk"] Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.211690 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w7rl\" (UniqueName: \"kubernetes.io/projected/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-kube-api-access-7w7rl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.211802 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 
06:02:02.211900 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.314151 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w7rl\" (UniqueName: \"kubernetes.io/projected/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-kube-api-access-7w7rl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.314319 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.314452 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.319045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.319743 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.331813 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w7rl\" (UniqueName: \"kubernetes.io/projected/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-kube-api-access-7w7rl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:02 crc kubenswrapper[4713]: I0314 06:02:02.487597 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:02:03 crc kubenswrapper[4713]: I0314 06:02:03.109378 4713 generic.go:334] "Generic (PLEG): container finished" podID="2a8806b4-7007-42b9-b55b-2392aab57894" containerID="e5fba5c615ba9c66775f99bcbb12687420fb521a0f71bd92b08f265b9ff80244" exitCode=0 Mar 14 06:02:03 crc kubenswrapper[4713]: I0314 06:02:03.109432 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557802-8lwbs" event={"ID":"2a8806b4-7007-42b9-b55b-2392aab57894","Type":"ContainerDied","Data":"e5fba5c615ba9c66775f99bcbb12687420fb521a0f71bd92b08f265b9ff80244"} Mar 14 06:02:03 crc kubenswrapper[4713]: I0314 06:02:03.113186 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk"] Mar 14 06:02:03 crc kubenswrapper[4713]: W0314 06:02:03.121663 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58fd92f9_e4e0_4d3a_8df0_dc21754faea3.slice/crio-87b6addd80ae811f10e722b88d9bbc3da4a27c4d6ebc1d7f3a77a33fdd04dded WatchSource:0}: Error finding container 87b6addd80ae811f10e722b88d9bbc3da4a27c4d6ebc1d7f3a77a33fdd04dded: Status 404 returned error can't find the container with id 87b6addd80ae811f10e722b88d9bbc3da4a27c4d6ebc1d7f3a77a33fdd04dded Mar 14 06:02:03 crc kubenswrapper[4713]: I0314 06:02:03.124482 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 06:02:04 crc kubenswrapper[4713]: I0314 06:02:04.123651 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" event={"ID":"58fd92f9-e4e0-4d3a-8df0-dc21754faea3","Type":"ContainerStarted","Data":"6176524e54f263f49a0d5b523438b5f2032eeebb24b5bf521586af621e1d890d"} Mar 14 06:02:04 crc kubenswrapper[4713]: I0314 06:02:04.124025 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" event={"ID":"58fd92f9-e4e0-4d3a-8df0-dc21754faea3","Type":"ContainerStarted","Data":"87b6addd80ae811f10e722b88d9bbc3da4a27c4d6ebc1d7f3a77a33fdd04dded"} Mar 14 06:02:04 crc kubenswrapper[4713]: I0314 06:02:04.157269 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" podStartSLOduration=1.754400799 podStartE2EDuration="2.157250152s" podCreationTimestamp="2026-03-14 06:02:02 +0000 UTC" firstStartedPulling="2026-03-14 06:02:03.124200944 +0000 UTC m=+2106.212110234" lastFinishedPulling="2026-03-14 06:02:03.527050287 +0000 UTC m=+2106.614959587" observedRunningTime="2026-03-14 06:02:04.13844187 +0000 UTC m=+2107.226351160" watchObservedRunningTime="2026-03-14 06:02:04.157250152 +0000 UTC m=+2107.245159452" Mar 14 06:02:04 crc kubenswrapper[4713]: I0314 06:02:04.651321 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557802-8lwbs" Mar 14 06:02:04 crc kubenswrapper[4713]: I0314 06:02:04.704257 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmm7w\" (UniqueName: \"kubernetes.io/projected/2a8806b4-7007-42b9-b55b-2392aab57894-kube-api-access-rmm7w\") pod \"2a8806b4-7007-42b9-b55b-2392aab57894\" (UID: \"2a8806b4-7007-42b9-b55b-2392aab57894\") " Mar 14 06:02:04 crc kubenswrapper[4713]: I0314 06:02:04.711763 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8806b4-7007-42b9-b55b-2392aab57894-kube-api-access-rmm7w" (OuterVolumeSpecName: "kube-api-access-rmm7w") pod "2a8806b4-7007-42b9-b55b-2392aab57894" (UID: "2a8806b4-7007-42b9-b55b-2392aab57894"). InnerVolumeSpecName "kube-api-access-rmm7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:02:04 crc kubenswrapper[4713]: I0314 06:02:04.808504 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmm7w\" (UniqueName: \"kubernetes.io/projected/2a8806b4-7007-42b9-b55b-2392aab57894-kube-api-access-rmm7w\") on node \"crc\" DevicePath \"\"" Mar 14 06:02:05 crc kubenswrapper[4713]: I0314 06:02:05.137041 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557802-8lwbs" Mar 14 06:02:05 crc kubenswrapper[4713]: I0314 06:02:05.138010 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557802-8lwbs" event={"ID":"2a8806b4-7007-42b9-b55b-2392aab57894","Type":"ContainerDied","Data":"0a20182755e06fdea4c398484d004ee63f51e55a5f37a33b12b9d29226352201"} Mar 14 06:02:05 crc kubenswrapper[4713]: I0314 06:02:05.138047 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a20182755e06fdea4c398484d004ee63f51e55a5f37a33b12b9d29226352201" Mar 14 06:02:05 crc kubenswrapper[4713]: I0314 06:02:05.173397 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557796-v46qf"] Mar 14 06:02:05 crc kubenswrapper[4713]: I0314 06:02:05.184735 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557796-v46qf"] Mar 14 06:02:05 crc kubenswrapper[4713]: I0314 06:02:05.610156 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e0467d-86c8-4ec7-af58-a2e9f3c4dd79" path="/var/lib/kubelet/pods/12e0467d-86c8-4ec7-af58-a2e9f3c4dd79/volumes" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.404230 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ht86m"] Mar 14 06:02:33 crc kubenswrapper[4713]: E0314 06:02:33.405652 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a8806b4-7007-42b9-b55b-2392aab57894" containerName="oc" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.405677 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8806b4-7007-42b9-b55b-2392aab57894" containerName="oc" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.406128 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8806b4-7007-42b9-b55b-2392aab57894" containerName="oc" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.408676 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.416291 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ht86m"] Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.598094 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjxdc\" (UniqueName: \"kubernetes.io/projected/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-kube-api-access-bjxdc\") pod \"redhat-operators-ht86m\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.598248 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-catalog-content\") pod \"redhat-operators-ht86m\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.598418 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-utilities\") pod \"redhat-operators-ht86m\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " 
pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.700310 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-utilities\") pod \"redhat-operators-ht86m\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.701096 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-utilities\") pod \"redhat-operators-ht86m\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.701408 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjxdc\" (UniqueName: \"kubernetes.io/projected/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-kube-api-access-bjxdc\") pod \"redhat-operators-ht86m\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.702973 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-catalog-content\") pod \"redhat-operators-ht86m\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.703389 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-catalog-content\") pod \"redhat-operators-ht86m\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc 
kubenswrapper[4713]: I0314 06:02:33.723722 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjxdc\" (UniqueName: \"kubernetes.io/projected/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-kube-api-access-bjxdc\") pod \"redhat-operators-ht86m\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:33 crc kubenswrapper[4713]: I0314 06:02:33.736837 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:34 crc kubenswrapper[4713]: I0314 06:02:34.215121 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ht86m"] Mar 14 06:02:34 crc kubenswrapper[4713]: I0314 06:02:34.459674 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ht86m" event={"ID":"0ee60055-ea9a-41c0-b2a4-ffeabf02257e","Type":"ContainerStarted","Data":"26ec58dd992bb992fd93f507bd21f19dd1f4c69172ca8f6c9246790bce6d0a17"} Mar 14 06:02:35 crc kubenswrapper[4713]: I0314 06:02:35.489024 4713 generic.go:334] "Generic (PLEG): container finished" podID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerID="da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2" exitCode=0 Mar 14 06:02:35 crc kubenswrapper[4713]: I0314 06:02:35.489362 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ht86m" event={"ID":"0ee60055-ea9a-41c0-b2a4-ffeabf02257e","Type":"ContainerDied","Data":"da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2"} Mar 14 06:02:36 crc kubenswrapper[4713]: I0314 06:02:36.048600 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-pf6rx"] Mar 14 06:02:36 crc kubenswrapper[4713]: I0314 06:02:36.060728 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-pf6rx"] Mar 14 06:02:37 crc kubenswrapper[4713]: I0314 
06:02:37.510340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ht86m" event={"ID":"0ee60055-ea9a-41c0-b2a4-ffeabf02257e","Type":"ContainerStarted","Data":"bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff"} Mar 14 06:02:37 crc kubenswrapper[4713]: I0314 06:02:37.579440 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6b1099-f4ac-4540-b964-334be68df63a" path="/var/lib/kubelet/pods/ea6b1099-f4ac-4540-b964-334be68df63a/volumes" Mar 14 06:02:44 crc kubenswrapper[4713]: I0314 06:02:44.607261 4713 generic.go:334] "Generic (PLEG): container finished" podID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerID="bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff" exitCode=0 Mar 14 06:02:44 crc kubenswrapper[4713]: I0314 06:02:44.607698 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ht86m" event={"ID":"0ee60055-ea9a-41c0-b2a4-ffeabf02257e","Type":"ContainerDied","Data":"bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff"} Mar 14 06:02:45 crc kubenswrapper[4713]: I0314 06:02:45.627544 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ht86m" event={"ID":"0ee60055-ea9a-41c0-b2a4-ffeabf02257e","Type":"ContainerStarted","Data":"3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604"} Mar 14 06:02:45 crc kubenswrapper[4713]: I0314 06:02:45.649638 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ht86m" podStartSLOduration=3.14964195 podStartE2EDuration="12.649615902s" podCreationTimestamp="2026-03-14 06:02:33 +0000 UTC" firstStartedPulling="2026-03-14 06:02:35.498891906 +0000 UTC m=+2138.586801206" lastFinishedPulling="2026-03-14 06:02:44.998865868 +0000 UTC m=+2148.086775158" observedRunningTime="2026-03-14 06:02:45.645885874 +0000 UTC m=+2148.733795174" 
watchObservedRunningTime="2026-03-14 06:02:45.649615902 +0000 UTC m=+2148.737525192" Mar 14 06:02:46 crc kubenswrapper[4713]: I0314 06:02:46.048011 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-pzsx4"] Mar 14 06:02:46 crc kubenswrapper[4713]: I0314 06:02:46.058770 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-pzsx4"] Mar 14 06:02:47 crc kubenswrapper[4713]: I0314 06:02:47.576699 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9b3f3d-45ae-454f-9430-9b69a22a05b4" path="/var/lib/kubelet/pods/da9b3f3d-45ae-454f-9430-9b69a22a05b4/volumes" Mar 14 06:02:48 crc kubenswrapper[4713]: I0314 06:02:48.045990 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-44c8n"] Mar 14 06:02:48 crc kubenswrapper[4713]: I0314 06:02:48.085586 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-44c8n"] Mar 14 06:02:48 crc kubenswrapper[4713]: I0314 06:02:48.344645 4713 scope.go:117] "RemoveContainer" containerID="bb77a49d7913dd4497d606f3761b268321a685306a8c9b887a32cff4cab5b47b" Mar 14 06:02:48 crc kubenswrapper[4713]: I0314 06:02:48.370671 4713 scope.go:117] "RemoveContainer" containerID="6dbcdc373df2cd984cd4dca3c5a93735843372117e4ed11b91735b53b8e2fe99" Mar 14 06:02:48 crc kubenswrapper[4713]: I0314 06:02:48.426923 4713 scope.go:117] "RemoveContainer" containerID="0fc9d4ad5f5c526af06bf8c1b7a8f59df89cb7702b3d4c67df33c2386b50eed1" Mar 14 06:02:48 crc kubenswrapper[4713]: I0314 06:02:48.492011 4713 scope.go:117] "RemoveContainer" containerID="17266f7dc0dea3431c2c76e2db30536c71b44274e4700047f49fad0a009b00e9" Mar 14 06:02:48 crc kubenswrapper[4713]: I0314 06:02:48.572097 4713 scope.go:117] "RemoveContainer" containerID="71ae1a7c8b695285fb54b047398b7eb8afbf227c61ced8b82fbdf8e02c301614" Mar 14 06:02:48 crc kubenswrapper[4713]: I0314 06:02:48.639617 4713 scope.go:117] "RemoveContainer" 
containerID="b73163b5d670b2cff92d2108ac79295448b493456a92cdb9d3fdbf4f96518c09" Mar 14 06:02:49 crc kubenswrapper[4713]: I0314 06:02:49.578363 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e724ed74-dc1e-43d9-84f2-e774c7d969bf" path="/var/lib/kubelet/pods/e724ed74-dc1e-43d9-84f2-e774c7d969bf/volumes" Mar 14 06:02:53 crc kubenswrapper[4713]: I0314 06:02:53.737413 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:53 crc kubenswrapper[4713]: I0314 06:02:53.737867 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:53 crc kubenswrapper[4713]: I0314 06:02:53.790244 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:54 crc kubenswrapper[4713]: I0314 06:02:54.795550 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:54 crc kubenswrapper[4713]: I0314 06:02:54.849885 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ht86m"] Mar 14 06:02:56 crc kubenswrapper[4713]: I0314 06:02:56.758122 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ht86m" podUID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerName="registry-server" containerID="cri-o://3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604" gracePeriod=2 Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.409583 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.484981 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-catalog-content\") pod \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.485088 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-utilities\") pod \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.485235 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjxdc\" (UniqueName: \"kubernetes.io/projected/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-kube-api-access-bjxdc\") pod \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\" (UID: \"0ee60055-ea9a-41c0-b2a4-ffeabf02257e\") " Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.487704 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-utilities" (OuterVolumeSpecName: "utilities") pod "0ee60055-ea9a-41c0-b2a4-ffeabf02257e" (UID: "0ee60055-ea9a-41c0-b2a4-ffeabf02257e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.494622 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-kube-api-access-bjxdc" (OuterVolumeSpecName: "kube-api-access-bjxdc") pod "0ee60055-ea9a-41c0-b2a4-ffeabf02257e" (UID: "0ee60055-ea9a-41c0-b2a4-ffeabf02257e"). InnerVolumeSpecName "kube-api-access-bjxdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.588666 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjxdc\" (UniqueName: \"kubernetes.io/projected/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-kube-api-access-bjxdc\") on node \"crc\" DevicePath \"\"" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.588704 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.601160 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ee60055-ea9a-41c0-b2a4-ffeabf02257e" (UID: "0ee60055-ea9a-41c0-b2a4-ffeabf02257e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.693015 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ee60055-ea9a-41c0-b2a4-ffeabf02257e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.769462 4713 generic.go:334] "Generic (PLEG): container finished" podID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerID="3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604" exitCode=0 Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.769504 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ht86m" event={"ID":"0ee60055-ea9a-41c0-b2a4-ffeabf02257e","Type":"ContainerDied","Data":"3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604"} Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.769529 4713 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-ht86m" event={"ID":"0ee60055-ea9a-41c0-b2a4-ffeabf02257e","Type":"ContainerDied","Data":"26ec58dd992bb992fd93f507bd21f19dd1f4c69172ca8f6c9246790bce6d0a17"} Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.769543 4713 scope.go:117] "RemoveContainer" containerID="3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.769657 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ht86m" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.803137 4713 scope.go:117] "RemoveContainer" containerID="bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.807060 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ht86m"] Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.818644 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ht86m"] Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.823437 4713 scope.go:117] "RemoveContainer" containerID="da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.881218 4713 scope.go:117] "RemoveContainer" containerID="3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604" Mar 14 06:02:57 crc kubenswrapper[4713]: E0314 06:02:57.881813 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604\": container with ID starting with 3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604 not found: ID does not exist" containerID="3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.881860 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604"} err="failed to get container status \"3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604\": rpc error: code = NotFound desc = could not find container \"3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604\": container with ID starting with 3f4a03359eb7c15995d00989d5e10a1338fac5dd6607408515df0a7b940d0604 not found: ID does not exist" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.881891 4713 scope.go:117] "RemoveContainer" containerID="bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff" Mar 14 06:02:57 crc kubenswrapper[4713]: E0314 06:02:57.882253 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff\": container with ID starting with bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff not found: ID does not exist" containerID="bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.882284 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff"} err="failed to get container status \"bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff\": rpc error: code = NotFound desc = could not find container \"bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff\": container with ID starting with bc830c648fb89b6a59124f4cc3f170eaaff047e3d48c1b9a1e49c3d6dda1dbff not found: ID does not exist" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.882305 4713 scope.go:117] "RemoveContainer" containerID="da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2" Mar 14 06:02:57 crc kubenswrapper[4713]: E0314 
06:02:57.882660 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2\": container with ID starting with da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2 not found: ID does not exist" containerID="da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2" Mar 14 06:02:57 crc kubenswrapper[4713]: I0314 06:02:57.882685 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2"} err="failed to get container status \"da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2\": rpc error: code = NotFound desc = could not find container \"da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2\": container with ID starting with da7d90f6538648231aaadf0ec945320274aa6cd446e9d97664b5f0c0cfbb21c2 not found: ID does not exist" Mar 14 06:02:59 crc kubenswrapper[4713]: I0314 06:02:59.578574 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" path="/var/lib/kubelet/pods/0ee60055-ea9a-41c0-b2a4-ffeabf02257e/volumes" Mar 14 06:03:00 crc kubenswrapper[4713]: I0314 06:03:00.065266 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-l62vj"] Mar 14 06:03:00 crc kubenswrapper[4713]: I0314 06:03:00.081027 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-l62vj"] Mar 14 06:03:01 crc kubenswrapper[4713]: I0314 06:03:01.036779 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-r6jzk"] Mar 14 06:03:01 crc kubenswrapper[4713]: I0314 06:03:01.051621 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-r6jzk"] Mar 14 06:03:01 crc kubenswrapper[4713]: I0314 06:03:01.580345 4713 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="0cd7eedb-d5e4-4df8-9ff6-717989483135" path="/var/lib/kubelet/pods/0cd7eedb-d5e4-4df8-9ff6-717989483135/volumes" Mar 14 06:03:01 crc kubenswrapper[4713]: I0314 06:03:01.581353 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3e039f-375f-411e-b265-f6188fc80d58" path="/var/lib/kubelet/pods/7d3e039f-375f-411e-b265-f6188fc80d58/volumes" Mar 14 06:03:10 crc kubenswrapper[4713]: I0314 06:03:10.731343 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:03:10 crc kubenswrapper[4713]: I0314 06:03:10.731942 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:03:40 crc kubenswrapper[4713]: I0314 06:03:40.731807 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:03:40 crc kubenswrapper[4713]: I0314 06:03:40.732406 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:03:48 crc kubenswrapper[4713]: I0314 06:03:48.826440 4713 scope.go:117] "RemoveContainer" 
containerID="ad7930d216d96bbd0b6b2c8fe04e353dd5a9fd9f77f5b49eadf9bae2f0410bcf" Mar 14 06:03:48 crc kubenswrapper[4713]: I0314 06:03:48.869468 4713 scope.go:117] "RemoveContainer" containerID="a314a81594457aabe7d7b756ec1dc15d8bb66ba7e671bd1329baed963bb45a34" Mar 14 06:03:48 crc kubenswrapper[4713]: I0314 06:03:48.933409 4713 scope.go:117] "RemoveContainer" containerID="e1a96dcc5fb019e1db1c1eeabb3ecd04a2f32dd7153079efd4a1d3ada0c6bdd7" Mar 14 06:03:50 crc kubenswrapper[4713]: I0314 06:03:50.045125 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vgcgm"] Mar 14 06:03:50 crc kubenswrapper[4713]: I0314 06:03:50.056486 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vgcgm"] Mar 14 06:03:51 crc kubenswrapper[4713]: I0314 06:03:51.575825 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd9da5a8-6d6b-404f-9cb7-3030364e35e2" path="/var/lib/kubelet/pods/bd9da5a8-6d6b-404f-9cb7-3030364e35e2/volumes" Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.036980 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-bcce-account-create-update-ftbtx"] Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.054464 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9002-account-create-update-mlmq6"] Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.065696 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5n98m"] Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.075588 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-d28jb"] Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.085443 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-782d-account-create-update-pglqr"] Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.096265 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-782d-account-create-update-pglqr"] Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.105978 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9002-account-create-update-mlmq6"] Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.115234 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5n98m"] Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.124428 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-d28jb"] Mar 14 06:03:52 crc kubenswrapper[4713]: I0314 06:03:52.133398 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-bcce-account-create-update-ftbtx"] Mar 14 06:03:53 crc kubenswrapper[4713]: I0314 06:03:53.382228 4713 generic.go:334] "Generic (PLEG): container finished" podID="58fd92f9-e4e0-4d3a-8df0-dc21754faea3" containerID="6176524e54f263f49a0d5b523438b5f2032eeebb24b5bf521586af621e1d890d" exitCode=0 Mar 14 06:03:53 crc kubenswrapper[4713]: I0314 06:03:53.382328 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" event={"ID":"58fd92f9-e4e0-4d3a-8df0-dc21754faea3","Type":"ContainerDied","Data":"6176524e54f263f49a0d5b523438b5f2032eeebb24b5bf521586af621e1d890d"} Mar 14 06:03:53 crc kubenswrapper[4713]: I0314 06:03:53.578648 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25be920e-162b-4f60-851b-228167576b04" path="/var/lib/kubelet/pods/25be920e-162b-4f60-851b-228167576b04/volumes" Mar 14 06:03:53 crc kubenswrapper[4713]: I0314 06:03:53.579449 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec0abd6-d181-493f-a285-932a17fac41d" path="/var/lib/kubelet/pods/bec0abd6-d181-493f-a285-932a17fac41d/volumes" Mar 14 06:03:53 crc kubenswrapper[4713]: I0314 06:03:53.580063 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="da0a922f-47a8-482f-b2e5-b9fb6176c221" path="/var/lib/kubelet/pods/da0a922f-47a8-482f-b2e5-b9fb6176c221/volumes" Mar 14 06:03:53 crc kubenswrapper[4713]: I0314 06:03:53.580718 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b63ac3-87f8-43e2-b546-0cd9d6025e7e" path="/var/lib/kubelet/pods/e3b63ac3-87f8-43e2-b546-0cd9d6025e7e/volumes" Mar 14 06:03:53 crc kubenswrapper[4713]: I0314 06:03:53.581828 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa23366-9513-4a8e-af1f-1b6b7596a5ac" path="/var/lib/kubelet/pods/faa23366-9513-4a8e-af1f-1b6b7596a5ac/volumes" Mar 14 06:03:54 crc kubenswrapper[4713]: I0314 06:03:54.939615 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.065238 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-ssh-key-openstack-edpm-ipam\") pod \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.065677 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-inventory\") pod \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.066072 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w7rl\" (UniqueName: \"kubernetes.io/projected/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-kube-api-access-7w7rl\") pod \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\" (UID: \"58fd92f9-e4e0-4d3a-8df0-dc21754faea3\") " Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.075121 4713 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-kube-api-access-7w7rl" (OuterVolumeSpecName: "kube-api-access-7w7rl") pod "58fd92f9-e4e0-4d3a-8df0-dc21754faea3" (UID: "58fd92f9-e4e0-4d3a-8df0-dc21754faea3"). InnerVolumeSpecName "kube-api-access-7w7rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.096216 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "58fd92f9-e4e0-4d3a-8df0-dc21754faea3" (UID: "58fd92f9-e4e0-4d3a-8df0-dc21754faea3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.101359 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-inventory" (OuterVolumeSpecName: "inventory") pod "58fd92f9-e4e0-4d3a-8df0-dc21754faea3" (UID: "58fd92f9-e4e0-4d3a-8df0-dc21754faea3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.168787 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.168830 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.168845 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w7rl\" (UniqueName: \"kubernetes.io/projected/58fd92f9-e4e0-4d3a-8df0-dc21754faea3-kube-api-access-7w7rl\") on node \"crc\" DevicePath \"\"" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.407118 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" event={"ID":"58fd92f9-e4e0-4d3a-8df0-dc21754faea3","Type":"ContainerDied","Data":"87b6addd80ae811f10e722b88d9bbc3da4a27c4d6ebc1d7f3a77a33fdd04dded"} Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.407185 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87b6addd80ae811f10e722b88d9bbc3da4a27c4d6ebc1d7f3a77a33fdd04dded" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.407352 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.495625 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f"] Mar 14 06:03:55 crc kubenswrapper[4713]: E0314 06:03:55.496343 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerName="extract-utilities" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.496430 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerName="extract-utilities" Mar 14 06:03:55 crc kubenswrapper[4713]: E0314 06:03:55.496517 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58fd92f9-e4e0-4d3a-8df0-dc21754faea3" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.496588 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fd92f9-e4e0-4d3a-8df0-dc21754faea3" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 14 06:03:55 crc kubenswrapper[4713]: E0314 06:03:55.496646 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerName="registry-server" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.496698 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerName="registry-server" Mar 14 06:03:55 crc kubenswrapper[4713]: E0314 06:03:55.496774 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerName="extract-content" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.496836 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerName="extract-content" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 
06:03:55.497173 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="58fd92f9-e4e0-4d3a-8df0-dc21754faea3" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.497298 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee60055-ea9a-41c0-b2a4-ffeabf02257e" containerName="registry-server" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.498278 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.501117 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.501276 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.501354 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.503811 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.507238 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f"] Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.679907 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-txv7f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc 
kubenswrapper[4713]: I0314 06:03:55.680616 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-txv7f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.680734 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lptvs\" (UniqueName: \"kubernetes.io/projected/55f62410-5eca-443c-86b0-39b49d969e9f-kube-api-access-lptvs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-txv7f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.782975 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-txv7f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.783119 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lptvs\" (UniqueName: \"kubernetes.io/projected/55f62410-5eca-443c-86b0-39b49d969e9f-kube-api-access-lptvs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-txv7f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.783506 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-txv7f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.787524 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-txv7f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.791776 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-txv7f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.800268 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lptvs\" (UniqueName: \"kubernetes.io/projected/55f62410-5eca-443c-86b0-39b49d969e9f-kube-api-access-lptvs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-txv7f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:55 crc kubenswrapper[4713]: I0314 06:03:55.822017 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:03:56 crc kubenswrapper[4713]: I0314 06:03:56.393681 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f"] Mar 14 06:03:56 crc kubenswrapper[4713]: I0314 06:03:56.421443 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" event={"ID":"55f62410-5eca-443c-86b0-39b49d969e9f","Type":"ContainerStarted","Data":"3782194f96e5b70653de392d19bc16031fb1a1dd1fd10daf45b7788b9fe455f8"} Mar 14 06:03:57 crc kubenswrapper[4713]: I0314 06:03:57.432376 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" event={"ID":"55f62410-5eca-443c-86b0-39b49d969e9f","Type":"ContainerStarted","Data":"be47a814df9fe2c0c8b178a4eba22cd568aae3598f6e1f923e5fe45b01a7549a"} Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.146632 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" podStartSLOduration=4.693172391 podStartE2EDuration="5.146611726s" podCreationTimestamp="2026-03-14 06:03:55 +0000 UTC" firstStartedPulling="2026-03-14 06:03:56.397798844 +0000 UTC m=+2219.485708144" lastFinishedPulling="2026-03-14 06:03:56.851238179 +0000 UTC m=+2219.939147479" observedRunningTime="2026-03-14 06:03:57.461374066 +0000 UTC m=+2220.549283366" watchObservedRunningTime="2026-03-14 06:04:00.146611726 +0000 UTC m=+2223.234521026" Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.157192 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557804-42lnq"] Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.158718 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557804-42lnq" Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.161024 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.161522 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.165076 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.170769 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557804-42lnq"] Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.224771 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7nd6\" (UniqueName: \"kubernetes.io/projected/21da1e63-cb2b-4604-8d9c-614007e64c5e-kube-api-access-k7nd6\") pod \"auto-csr-approver-29557804-42lnq\" (UID: \"21da1e63-cb2b-4604-8d9c-614007e64c5e\") " pod="openshift-infra/auto-csr-approver-29557804-42lnq" Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.328053 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7nd6\" (UniqueName: \"kubernetes.io/projected/21da1e63-cb2b-4604-8d9c-614007e64c5e-kube-api-access-k7nd6\") pod \"auto-csr-approver-29557804-42lnq\" (UID: \"21da1e63-cb2b-4604-8d9c-614007e64c5e\") " pod="openshift-infra/auto-csr-approver-29557804-42lnq" Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.350107 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7nd6\" (UniqueName: \"kubernetes.io/projected/21da1e63-cb2b-4604-8d9c-614007e64c5e-kube-api-access-k7nd6\") pod \"auto-csr-approver-29557804-42lnq\" (UID: \"21da1e63-cb2b-4604-8d9c-614007e64c5e\") " 
pod="openshift-infra/auto-csr-approver-29557804-42lnq" Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.485771 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557804-42lnq" Mar 14 06:04:00 crc kubenswrapper[4713]: I0314 06:04:00.973957 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557804-42lnq"] Mar 14 06:04:00 crc kubenswrapper[4713]: W0314 06:04:00.976554 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21da1e63_cb2b_4604_8d9c_614007e64c5e.slice/crio-4cfe786bcb94fd14f427199173a89c2ce2b89c2c7395d3b31ca2ff740ea489c8 WatchSource:0}: Error finding container 4cfe786bcb94fd14f427199173a89c2ce2b89c2c7395d3b31ca2ff740ea489c8: Status 404 returned error can't find the container with id 4cfe786bcb94fd14f427199173a89c2ce2b89c2c7395d3b31ca2ff740ea489c8 Mar 14 06:04:01 crc kubenswrapper[4713]: I0314 06:04:01.478752 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557804-42lnq" event={"ID":"21da1e63-cb2b-4604-8d9c-614007e64c5e","Type":"ContainerStarted","Data":"4cfe786bcb94fd14f427199173a89c2ce2b89c2c7395d3b31ca2ff740ea489c8"} Mar 14 06:04:02 crc kubenswrapper[4713]: I0314 06:04:02.494753 4713 generic.go:334] "Generic (PLEG): container finished" podID="21da1e63-cb2b-4604-8d9c-614007e64c5e" containerID="b272a9bb621b7373c94ad09998c664d94abbe89e2ff50ee6e650d3afbf514303" exitCode=0 Mar 14 06:04:02 crc kubenswrapper[4713]: I0314 06:04:02.495051 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557804-42lnq" event={"ID":"21da1e63-cb2b-4604-8d9c-614007e64c5e","Type":"ContainerDied","Data":"b272a9bb621b7373c94ad09998c664d94abbe89e2ff50ee6e650d3afbf514303"} Mar 14 06:04:03 crc kubenswrapper[4713]: I0314 06:04:03.900182 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557804-42lnq" Mar 14 06:04:04 crc kubenswrapper[4713]: I0314 06:04:04.015711 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7nd6\" (UniqueName: \"kubernetes.io/projected/21da1e63-cb2b-4604-8d9c-614007e64c5e-kube-api-access-k7nd6\") pod \"21da1e63-cb2b-4604-8d9c-614007e64c5e\" (UID: \"21da1e63-cb2b-4604-8d9c-614007e64c5e\") " Mar 14 06:04:04 crc kubenswrapper[4713]: I0314 06:04:04.025071 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21da1e63-cb2b-4604-8d9c-614007e64c5e-kube-api-access-k7nd6" (OuterVolumeSpecName: "kube-api-access-k7nd6") pod "21da1e63-cb2b-4604-8d9c-614007e64c5e" (UID: "21da1e63-cb2b-4604-8d9c-614007e64c5e"). InnerVolumeSpecName "kube-api-access-k7nd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:04:04 crc kubenswrapper[4713]: I0314 06:04:04.118751 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7nd6\" (UniqueName: \"kubernetes.io/projected/21da1e63-cb2b-4604-8d9c-614007e64c5e-kube-api-access-k7nd6\") on node \"crc\" DevicePath \"\"" Mar 14 06:04:04 crc kubenswrapper[4713]: I0314 06:04:04.518808 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557804-42lnq" event={"ID":"21da1e63-cb2b-4604-8d9c-614007e64c5e","Type":"ContainerDied","Data":"4cfe786bcb94fd14f427199173a89c2ce2b89c2c7395d3b31ca2ff740ea489c8"} Mar 14 06:04:04 crc kubenswrapper[4713]: I0314 06:04:04.518848 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cfe786bcb94fd14f427199173a89c2ce2b89c2c7395d3b31ca2ff740ea489c8" Mar 14 06:04:04 crc kubenswrapper[4713]: I0314 06:04:04.518872 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557804-42lnq" Mar 14 06:04:04 crc kubenswrapper[4713]: I0314 06:04:04.958390 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557798-xv6b5"] Mar 14 06:04:04 crc kubenswrapper[4713]: I0314 06:04:04.969429 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557798-xv6b5"] Mar 14 06:04:05 crc kubenswrapper[4713]: I0314 06:04:05.577655 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f5d119-0ede-4eaa-a5cc-e1be3353385a" path="/var/lib/kubelet/pods/57f5d119-0ede-4eaa-a5cc-e1be3353385a/volumes" Mar 14 06:04:10 crc kubenswrapper[4713]: I0314 06:04:10.732066 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:04:10 crc kubenswrapper[4713]: I0314 06:04:10.732684 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:04:10 crc kubenswrapper[4713]: I0314 06:04:10.732736 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 06:04:10 crc kubenswrapper[4713]: I0314 06:04:10.733698 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d66e0160c929053a77359b172aa48e22841072ca35af3852080f63092daa147"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 06:04:10 crc kubenswrapper[4713]: I0314 06:04:10.733755 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://9d66e0160c929053a77359b172aa48e22841072ca35af3852080f63092daa147" gracePeriod=600 Mar 14 06:04:11 crc kubenswrapper[4713]: I0314 06:04:11.637555 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="9d66e0160c929053a77359b172aa48e22841072ca35af3852080f63092daa147" exitCode=0 Mar 14 06:04:11 crc kubenswrapper[4713]: I0314 06:04:11.637634 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"9d66e0160c929053a77359b172aa48e22841072ca35af3852080f63092daa147"} Mar 14 06:04:11 crc kubenswrapper[4713]: I0314 06:04:11.638408 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"} Mar 14 06:04:11 crc kubenswrapper[4713]: I0314 06:04:11.638439 4713 scope.go:117] "RemoveContainer" containerID="a6b2b8e5736d99b20c6e517d7c2fe766e98120c8e1776c26bb2c858122728429" Mar 14 06:04:34 crc kubenswrapper[4713]: I0314 06:04:34.046716 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrsf8"] Mar 14 06:04:34 crc kubenswrapper[4713]: I0314 06:04:34.058592 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrsf8"] Mar 14 06:04:35 crc kubenswrapper[4713]: I0314 06:04:35.716861 
4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd56fa6-325c-4813-bffe-a2cd3bf82257" path="/var/lib/kubelet/pods/0cd56fa6-325c-4813-bffe-a2cd3bf82257/volumes" Mar 14 06:04:49 crc kubenswrapper[4713]: I0314 06:04:49.109861 4713 scope.go:117] "RemoveContainer" containerID="3eede5e670de2d5a3dd4e1dcf1db98befac79c13e1dd6156de9f63432c9f9176" Mar 14 06:04:49 crc kubenswrapper[4713]: I0314 06:04:49.139483 4713 scope.go:117] "RemoveContainer" containerID="e70ec126e39036e60c7487b73cc482ee4cfa2556fc385def705e88cac3c57e96" Mar 14 06:04:49 crc kubenswrapper[4713]: I0314 06:04:49.243102 4713 scope.go:117] "RemoveContainer" containerID="2506a986b947dd8202511e0a11956aa33b328bb708c1d658cc22810b613a5d5e" Mar 14 06:04:49 crc kubenswrapper[4713]: I0314 06:04:49.314632 4713 scope.go:117] "RemoveContainer" containerID="a3d2d61e8bf96de324b5402aad0561de0c188360a65a4014eed190ff7dbc5051" Mar 14 06:04:49 crc kubenswrapper[4713]: I0314 06:04:49.343867 4713 scope.go:117] "RemoveContainer" containerID="317be9ba6837e17cd3e2c8e35046a0dc9dd0dd7e55036a9016f3e2b8d607e836" Mar 14 06:04:49 crc kubenswrapper[4713]: I0314 06:04:49.390272 4713 scope.go:117] "RemoveContainer" containerID="677d1231a67f4dd596979c1b218a784a48b411494858cff15931de060d72051a" Mar 14 06:04:49 crc kubenswrapper[4713]: I0314 06:04:49.436352 4713 scope.go:117] "RemoveContainer" containerID="de8508c0f617dfad9f581e264b3e013a501c8184dfcce239f4c95713cab8d3cc" Mar 14 06:04:49 crc kubenswrapper[4713]: I0314 06:04:49.456530 4713 scope.go:117] "RemoveContainer" containerID="f3f13f36c98e8b06153b0ac1947d2634c46b6f2eee47aa5b0eaea2069f209efc" Mar 14 06:04:58 crc kubenswrapper[4713]: I0314 06:04:58.044260 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-ncptd"] Mar 14 06:04:58 crc kubenswrapper[4713]: I0314 06:04:58.055507 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-ncptd"] Mar 14 06:04:59 crc kubenswrapper[4713]: I0314 
06:04:59.576503 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31b3a4b9-569c-4648-88bc-377d83007c53" path="/var/lib/kubelet/pods/31b3a4b9-569c-4648-88bc-377d83007c53/volumes" Mar 14 06:05:01 crc kubenswrapper[4713]: I0314 06:05:01.031815 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xmbd7"] Mar 14 06:05:01 crc kubenswrapper[4713]: I0314 06:05:01.044277 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xmbd7"] Mar 14 06:05:01 crc kubenswrapper[4713]: I0314 06:05:01.588513 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9aceec-678a-410d-be9f-e5a6a1116c0b" path="/var/lib/kubelet/pods/3f9aceec-678a-410d-be9f-e5a6a1116c0b/volumes" Mar 14 06:05:04 crc kubenswrapper[4713]: I0314 06:05:04.034188 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-958lf"] Mar 14 06:05:04 crc kubenswrapper[4713]: I0314 06:05:04.043808 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-958lf"] Mar 14 06:05:05 crc kubenswrapper[4713]: I0314 06:05:05.034977 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-38dc-account-create-update-jstb7"] Mar 14 06:05:05 crc kubenswrapper[4713]: I0314 06:05:05.048164 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-38dc-account-create-update-jstb7"] Mar 14 06:05:05 crc kubenswrapper[4713]: I0314 06:05:05.576970 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66beacbe-5dac-4e64-a18a-092775f976ca" path="/var/lib/kubelet/pods/66beacbe-5dac-4e64-a18a-092775f976ca/volumes" Mar 14 06:05:05 crc kubenswrapper[4713]: I0314 06:05:05.577747 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7058c887-6736-417a-83c0-18e8e9ca53f3" path="/var/lib/kubelet/pods/7058c887-6736-417a-83c0-18e8e9ca53f3/volumes" Mar 14 06:05:08 crc kubenswrapper[4713]: I0314 
06:05:08.277079 4713 generic.go:334] "Generic (PLEG): container finished" podID="55f62410-5eca-443c-86b0-39b49d969e9f" containerID="be47a814df9fe2c0c8b178a4eba22cd568aae3598f6e1f923e5fe45b01a7549a" exitCode=0 Mar 14 06:05:08 crc kubenswrapper[4713]: I0314 06:05:08.277121 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" event={"ID":"55f62410-5eca-443c-86b0-39b49d969e9f","Type":"ContainerDied","Data":"be47a814df9fe2c0c8b178a4eba22cd568aae3598f6e1f923e5fe45b01a7549a"} Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.748566 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.851423 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lptvs\" (UniqueName: \"kubernetes.io/projected/55f62410-5eca-443c-86b0-39b49d969e9f-kube-api-access-lptvs\") pod \"55f62410-5eca-443c-86b0-39b49d969e9f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.852065 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-inventory\") pod \"55f62410-5eca-443c-86b0-39b49d969e9f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.852181 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-ssh-key-openstack-edpm-ipam\") pod \"55f62410-5eca-443c-86b0-39b49d969e9f\" (UID: \"55f62410-5eca-443c-86b0-39b49d969e9f\") " Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.858387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/55f62410-5eca-443c-86b0-39b49d969e9f-kube-api-access-lptvs" (OuterVolumeSpecName: "kube-api-access-lptvs") pod "55f62410-5eca-443c-86b0-39b49d969e9f" (UID: "55f62410-5eca-443c-86b0-39b49d969e9f"). InnerVolumeSpecName "kube-api-access-lptvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.883893 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-inventory" (OuterVolumeSpecName: "inventory") pod "55f62410-5eca-443c-86b0-39b49d969e9f" (UID: "55f62410-5eca-443c-86b0-39b49d969e9f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.890725 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "55f62410-5eca-443c-86b0-39b49d969e9f" (UID: "55f62410-5eca-443c-86b0-39b49d969e9f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.956015 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lptvs\" (UniqueName: \"kubernetes.io/projected/55f62410-5eca-443c-86b0-39b49d969e9f-kube-api-access-lptvs\") on node \"crc\" DevicePath \"\"" Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.956045 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:05:09 crc kubenswrapper[4713]: I0314 06:05:09.956057 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55f62410-5eca-443c-86b0-39b49d969e9f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.522573 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" event={"ID":"55f62410-5eca-443c-86b0-39b49d969e9f","Type":"ContainerDied","Data":"3782194f96e5b70653de392d19bc16031fb1a1dd1fd10daf45b7788b9fe455f8"} Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.523361 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3782194f96e5b70653de392d19bc16031fb1a1dd1fd10daf45b7788b9fe455f8" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.522642 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-txv7f" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.598970 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb"] Mar 14 06:05:11 crc kubenswrapper[4713]: E0314 06:05:11.599716 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21da1e63-cb2b-4604-8d9c-614007e64c5e" containerName="oc" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.599741 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="21da1e63-cb2b-4604-8d9c-614007e64c5e" containerName="oc" Mar 14 06:05:11 crc kubenswrapper[4713]: E0314 06:05:11.599784 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55f62410-5eca-443c-86b0-39b49d969e9f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.599796 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f62410-5eca-443c-86b0-39b49d969e9f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.600165 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="55f62410-5eca-443c-86b0-39b49d969e9f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.600193 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="21da1e63-cb2b-4604-8d9c-614007e64c5e" containerName="oc" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.601451 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.610034 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.610055 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.610046 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.610305 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.628910 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb"] Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.703508 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hldjt\" (UniqueName: \"kubernetes.io/projected/acd65abe-8ba5-4743-b778-d18f74ca3f2b-kube-api-access-hldjt\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.703575 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 
06:05:11.703740 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.805587 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hldjt\" (UniqueName: \"kubernetes.io/projected/acd65abe-8ba5-4743-b778-d18f74ca3f2b-kube-api-access-hldjt\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.805645 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.805787 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.810563 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.812562 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.827184 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hldjt\" (UniqueName: \"kubernetes.io/projected/acd65abe-8ba5-4743-b778-d18f74ca3f2b-kube-api-access-hldjt\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:11 crc kubenswrapper[4713]: I0314 06:05:11.933760 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:12 crc kubenswrapper[4713]: I0314 06:05:12.470271 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb"] Mar 14 06:05:12 crc kubenswrapper[4713]: I0314 06:05:12.535310 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" event={"ID":"acd65abe-8ba5-4743-b778-d18f74ca3f2b","Type":"ContainerStarted","Data":"ce80f90afdc71b285e110f3cfdf9f7f5eaad0b0ecdae0d8c60551bf41541129f"} Mar 14 06:05:14 crc kubenswrapper[4713]: I0314 06:05:14.555885 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" event={"ID":"acd65abe-8ba5-4743-b778-d18f74ca3f2b","Type":"ContainerStarted","Data":"66ce7828f0b833a519c5199c01f30e875f29ec9b534ad304e33c60dcc1492107"} Mar 14 06:05:14 crc kubenswrapper[4713]: I0314 06:05:14.572357 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" podStartSLOduration=2.322225893 podStartE2EDuration="3.572334342s" podCreationTimestamp="2026-03-14 06:05:11 +0000 UTC" firstStartedPulling="2026-03-14 06:05:12.478642997 +0000 UTC m=+2295.566552297" lastFinishedPulling="2026-03-14 06:05:13.728751446 +0000 UTC m=+2296.816660746" observedRunningTime="2026-03-14 06:05:14.56906838 +0000 UTC m=+2297.656977680" watchObservedRunningTime="2026-03-14 06:05:14.572334342 +0000 UTC m=+2297.660243642" Mar 14 06:05:21 crc kubenswrapper[4713]: I0314 06:05:21.794626 4713 generic.go:334] "Generic (PLEG): container finished" podID="acd65abe-8ba5-4743-b778-d18f74ca3f2b" containerID="66ce7828f0b833a519c5199c01f30e875f29ec9b534ad304e33c60dcc1492107" exitCode=0 Mar 14 06:05:21 crc kubenswrapper[4713]: I0314 06:05:21.794709 4713 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" event={"ID":"acd65abe-8ba5-4743-b778-d18f74ca3f2b","Type":"ContainerDied","Data":"66ce7828f0b833a519c5199c01f30e875f29ec9b534ad304e33c60dcc1492107"} Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.300231 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.359561 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hldjt\" (UniqueName: \"kubernetes.io/projected/acd65abe-8ba5-4743-b778-d18f74ca3f2b-kube-api-access-hldjt\") pod \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.359840 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-ssh-key-openstack-edpm-ipam\") pod \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.359894 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-inventory\") pod \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\" (UID: \"acd65abe-8ba5-4743-b778-d18f74ca3f2b\") " Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.365723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd65abe-8ba5-4743-b778-d18f74ca3f2b-kube-api-access-hldjt" (OuterVolumeSpecName: "kube-api-access-hldjt") pod "acd65abe-8ba5-4743-b778-d18f74ca3f2b" (UID: "acd65abe-8ba5-4743-b778-d18f74ca3f2b"). InnerVolumeSpecName "kube-api-access-hldjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.404404 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "acd65abe-8ba5-4743-b778-d18f74ca3f2b" (UID: "acd65abe-8ba5-4743-b778-d18f74ca3f2b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.419069 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-inventory" (OuterVolumeSpecName: "inventory") pod "acd65abe-8ba5-4743-b778-d18f74ca3f2b" (UID: "acd65abe-8ba5-4743-b778-d18f74ca3f2b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.462639 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hldjt\" (UniqueName: \"kubernetes.io/projected/acd65abe-8ba5-4743-b778-d18f74ca3f2b-kube-api-access-hldjt\") on node \"crc\" DevicePath \"\"" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.462677 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.462694 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acd65abe-8ba5-4743-b778-d18f74ca3f2b-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.815809 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" 
event={"ID":"acd65abe-8ba5-4743-b778-d18f74ca3f2b","Type":"ContainerDied","Data":"ce80f90afdc71b285e110f3cfdf9f7f5eaad0b0ecdae0d8c60551bf41541129f"} Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.816131 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce80f90afdc71b285e110f3cfdf9f7f5eaad0b0ecdae0d8c60551bf41541129f" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.815848 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.899239 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2"] Mar 14 06:05:23 crc kubenswrapper[4713]: E0314 06:05:23.899873 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd65abe-8ba5-4743-b778-d18f74ca3f2b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.899902 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd65abe-8ba5-4743-b778-d18f74ca3f2b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.900292 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd65abe-8ba5-4743-b778-d18f74ca3f2b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.901178 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.903901 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.904450 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.904477 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.906752 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.912885 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2"] Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.974831 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5mtz2\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 06:05:23.975140 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xk2d\" (UniqueName: \"kubernetes.io/projected/aca83833-8133-4064-b1b3-05989d69b5b0-kube-api-access-8xk2d\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5mtz2\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:23 crc kubenswrapper[4713]: I0314 
06:05:23.975282 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5mtz2\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:24 crc kubenswrapper[4713]: I0314 06:05:24.077479 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5mtz2\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:24 crc kubenswrapper[4713]: I0314 06:05:24.077590 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xk2d\" (UniqueName: \"kubernetes.io/projected/aca83833-8133-4064-b1b3-05989d69b5b0-kube-api-access-8xk2d\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5mtz2\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:24 crc kubenswrapper[4713]: I0314 06:05:24.077658 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5mtz2\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:24 crc kubenswrapper[4713]: I0314 06:05:24.082857 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5mtz2\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:24 crc kubenswrapper[4713]: I0314 06:05:24.084889 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5mtz2\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:24 crc kubenswrapper[4713]: I0314 06:05:24.095733 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xk2d\" (UniqueName: \"kubernetes.io/projected/aca83833-8133-4064-b1b3-05989d69b5b0-kube-api-access-8xk2d\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5mtz2\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:24 crc kubenswrapper[4713]: I0314 06:05:24.229349 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:05:24 crc kubenswrapper[4713]: I0314 06:05:24.783092 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2"] Mar 14 06:05:24 crc kubenswrapper[4713]: I0314 06:05:24.827554 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" event={"ID":"aca83833-8133-4064-b1b3-05989d69b5b0","Type":"ContainerStarted","Data":"a1488cc51441d086b39d435b856a749539dbc109bd786e0f7be59fc3e780dab5"} Mar 14 06:05:25 crc kubenswrapper[4713]: I0314 06:05:25.837241 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" event={"ID":"aca83833-8133-4064-b1b3-05989d69b5b0","Type":"ContainerStarted","Data":"8fd055004ded5cd85df7c55ebebb7ea9a05e766c94d1b078c917601e7ac14fd9"} Mar 14 06:05:25 crc kubenswrapper[4713]: I0314 06:05:25.862088 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" podStartSLOduration=2.160399672 podStartE2EDuration="2.862064958s" podCreationTimestamp="2026-03-14 06:05:23 +0000 UTC" firstStartedPulling="2026-03-14 06:05:24.782586067 +0000 UTC m=+2307.870495377" lastFinishedPulling="2026-03-14 06:05:25.484251363 +0000 UTC m=+2308.572160663" observedRunningTime="2026-03-14 06:05:25.850810785 +0000 UTC m=+2308.938720085" watchObservedRunningTime="2026-03-14 06:05:25.862064958 +0000 UTC m=+2308.949974258" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.245420 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j52jx"] Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.249230 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.260805 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j52jx"] Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.289465 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gj29\" (UniqueName: \"kubernetes.io/projected/e7b927ae-84b8-4869-9783-b379c7e79611-kube-api-access-9gj29\") pod \"community-operators-j52jx\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.289535 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-utilities\") pod \"community-operators-j52jx\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.289953 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-catalog-content\") pod \"community-operators-j52jx\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.392860 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gj29\" (UniqueName: \"kubernetes.io/projected/e7b927ae-84b8-4869-9783-b379c7e79611-kube-api-access-9gj29\") pod \"community-operators-j52jx\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.393334 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-utilities\") pod \"community-operators-j52jx\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.393514 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-catalog-content\") pod \"community-operators-j52jx\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.394372 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-catalog-content\") pod \"community-operators-j52jx\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.395025 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-utilities\") pod \"community-operators-j52jx\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.414400 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gj29\" (UniqueName: \"kubernetes.io/projected/e7b927ae-84b8-4869-9783-b379c7e79611-kube-api-access-9gj29\") pod \"community-operators-j52jx\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:45 crc kubenswrapper[4713]: I0314 06:05:45.569634 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:05:46 crc kubenswrapper[4713]: I0314 06:05:46.065190 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6ftqx"] Mar 14 06:05:46 crc kubenswrapper[4713]: I0314 06:05:46.089144 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6ftqx"] Mar 14 06:05:46 crc kubenswrapper[4713]: I0314 06:05:46.189562 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j52jx"] Mar 14 06:05:47 crc kubenswrapper[4713]: I0314 06:05:47.121218 4713 generic.go:334] "Generic (PLEG): container finished" podID="e7b927ae-84b8-4869-9783-b379c7e79611" containerID="042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1" exitCode=0 Mar 14 06:05:47 crc kubenswrapper[4713]: I0314 06:05:47.121345 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j52jx" event={"ID":"e7b927ae-84b8-4869-9783-b379c7e79611","Type":"ContainerDied","Data":"042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1"} Mar 14 06:05:47 crc kubenswrapper[4713]: I0314 06:05:47.121601 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j52jx" event={"ID":"e7b927ae-84b8-4869-9783-b379c7e79611","Type":"ContainerStarted","Data":"51d34aae3550b72d15b072fd54ac4cdad5adba09271a28bf8312472e6ca9d11b"} Mar 14 06:05:47 crc kubenswrapper[4713]: I0314 06:05:47.583182 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fedb6b32-29d1-46af-a86f-1d96ebb1406d" path="/var/lib/kubelet/pods/fedb6b32-29d1-46af-a86f-1d96ebb1406d/volumes" Mar 14 06:05:49 crc kubenswrapper[4713]: I0314 06:05:49.146287 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j52jx" 
event={"ID":"e7b927ae-84b8-4869-9783-b379c7e79611","Type":"ContainerStarted","Data":"d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026"} Mar 14 06:05:49 crc kubenswrapper[4713]: I0314 06:05:49.651947 4713 scope.go:117] "RemoveContainer" containerID="b1d46683989bb8a0b72c9fb1594ba638639a58ed0ac497ed47cd3be3d800aa38" Mar 14 06:05:49 crc kubenswrapper[4713]: I0314 06:05:49.692754 4713 scope.go:117] "RemoveContainer" containerID="b02a0c7449b8cc3ded2bbce1e906b44b18d8684e7dc3a27ea2d7930d20b8753a" Mar 14 06:05:49 crc kubenswrapper[4713]: I0314 06:05:49.770378 4713 scope.go:117] "RemoveContainer" containerID="7fe333034490b3791cc5827ef673cde354dd7a9b075e0742fc0ec26b4191d3f8" Mar 14 06:05:49 crc kubenswrapper[4713]: I0314 06:05:49.825374 4713 scope.go:117] "RemoveContainer" containerID="c32f927083e23fa1d7f4747055af85bb344d63c376f173f695f534ad96517aaf" Mar 14 06:05:49 crc kubenswrapper[4713]: I0314 06:05:49.926384 4713 scope.go:117] "RemoveContainer" containerID="a151e8bfc0f807624971a02bebfee38264656b9c81d9f9d16d7de62503783952" Mar 14 06:05:55 crc kubenswrapper[4713]: I0314 06:05:55.232089 4713 generic.go:334] "Generic (PLEG): container finished" podID="e7b927ae-84b8-4869-9783-b379c7e79611" containerID="d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026" exitCode=0 Mar 14 06:05:55 crc kubenswrapper[4713]: I0314 06:05:55.232699 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j52jx" event={"ID":"e7b927ae-84b8-4869-9783-b379c7e79611","Type":"ContainerDied","Data":"d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026"} Mar 14 06:05:56 crc kubenswrapper[4713]: I0314 06:05:56.244187 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j52jx" event={"ID":"e7b927ae-84b8-4869-9783-b379c7e79611","Type":"ContainerStarted","Data":"84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef"} Mar 14 06:05:56 crc kubenswrapper[4713]: I0314 
06:05:56.269842 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j52jx" podStartSLOduration=2.7521238070000003 podStartE2EDuration="11.269815531s" podCreationTimestamp="2026-03-14 06:05:45 +0000 UTC" firstStartedPulling="2026-03-14 06:05:47.123411947 +0000 UTC m=+2330.211321247" lastFinishedPulling="2026-03-14 06:05:55.641103671 +0000 UTC m=+2338.729012971" observedRunningTime="2026-03-14 06:05:56.262227204 +0000 UTC m=+2339.350136504" watchObservedRunningTime="2026-03-14 06:05:56.269815531 +0000 UTC m=+2339.357724831" Mar 14 06:05:56 crc kubenswrapper[4713]: I0314 06:05:56.848241 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7lt55"] Mar 14 06:05:56 crc kubenswrapper[4713]: I0314 06:05:56.852888 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:56 crc kubenswrapper[4713]: I0314 06:05:56.861451 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lt55"] Mar 14 06:05:56 crc kubenswrapper[4713]: I0314 06:05:56.986859 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-catalog-content\") pod \"certified-operators-7lt55\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:56 crc kubenswrapper[4713]: I0314 06:05:56.987043 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dndbl\" (UniqueName: \"kubernetes.io/projected/631d005b-56f9-43e3-8eb9-f37554b25ebe-kube-api-access-dndbl\") pod \"certified-operators-7lt55\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:56 crc 
kubenswrapper[4713]: I0314 06:05:56.987305 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-utilities\") pod \"certified-operators-7lt55\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:57 crc kubenswrapper[4713]: I0314 06:05:57.089792 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-catalog-content\") pod \"certified-operators-7lt55\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:57 crc kubenswrapper[4713]: I0314 06:05:57.089942 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dndbl\" (UniqueName: \"kubernetes.io/projected/631d005b-56f9-43e3-8eb9-f37554b25ebe-kube-api-access-dndbl\") pod \"certified-operators-7lt55\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:57 crc kubenswrapper[4713]: I0314 06:05:57.089992 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-utilities\") pod \"certified-operators-7lt55\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:57 crc kubenswrapper[4713]: I0314 06:05:57.090524 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-utilities\") pod \"certified-operators-7lt55\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:57 crc kubenswrapper[4713]: I0314 
06:05:57.090561 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-catalog-content\") pod \"certified-operators-7lt55\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:57 crc kubenswrapper[4713]: I0314 06:05:57.109055 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dndbl\" (UniqueName: \"kubernetes.io/projected/631d005b-56f9-43e3-8eb9-f37554b25ebe-kube-api-access-dndbl\") pod \"certified-operators-7lt55\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:57 crc kubenswrapper[4713]: I0314 06:05:57.172087 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:05:57 crc kubenswrapper[4713]: I0314 06:05:57.669102 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lt55"] Mar 14 06:05:58 crc kubenswrapper[4713]: I0314 06:05:58.269919 4713 generic.go:334] "Generic (PLEG): container finished" podID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerID="4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3" exitCode=0 Mar 14 06:05:58 crc kubenswrapper[4713]: I0314 06:05:58.270025 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lt55" event={"ID":"631d005b-56f9-43e3-8eb9-f37554b25ebe","Type":"ContainerDied","Data":"4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3"} Mar 14 06:05:58 crc kubenswrapper[4713]: I0314 06:05:58.270299 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lt55" 
event={"ID":"631d005b-56f9-43e3-8eb9-f37554b25ebe","Type":"ContainerStarted","Data":"aec44f9d52895aa7b68c45567ea7d67eaf342dbfabd795aca972377a31e08e5a"} Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.147244 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557806-q87kz"] Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.149865 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557806-q87kz" Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.152832 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.152995 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.153101 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.160810 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557806-q87kz"] Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.236464 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwr4q\" (UniqueName: \"kubernetes.io/projected/0eb67cc5-82d7-41b1-be5e-308a496a8944-kube-api-access-zwr4q\") pod \"auto-csr-approver-29557806-q87kz\" (UID: \"0eb67cc5-82d7-41b1-be5e-308a496a8944\") " pod="openshift-infra/auto-csr-approver-29557806-q87kz" Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.321515 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lt55" event={"ID":"631d005b-56f9-43e3-8eb9-f37554b25ebe","Type":"ContainerStarted","Data":"7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8"} Mar 14 06:06:00 
crc kubenswrapper[4713]: I0314 06:06:00.339003 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwr4q\" (UniqueName: \"kubernetes.io/projected/0eb67cc5-82d7-41b1-be5e-308a496a8944-kube-api-access-zwr4q\") pod \"auto-csr-approver-29557806-q87kz\" (UID: \"0eb67cc5-82d7-41b1-be5e-308a496a8944\") " pod="openshift-infra/auto-csr-approver-29557806-q87kz" Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.361833 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwr4q\" (UniqueName: \"kubernetes.io/projected/0eb67cc5-82d7-41b1-be5e-308a496a8944-kube-api-access-zwr4q\") pod \"auto-csr-approver-29557806-q87kz\" (UID: \"0eb67cc5-82d7-41b1-be5e-308a496a8944\") " pod="openshift-infra/auto-csr-approver-29557806-q87kz" Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.483184 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557806-q87kz" Mar 14 06:06:00 crc kubenswrapper[4713]: I0314 06:06:00.973633 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557806-q87kz"] Mar 14 06:06:01 crc kubenswrapper[4713]: I0314 06:06:01.336393 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557806-q87kz" event={"ID":"0eb67cc5-82d7-41b1-be5e-308a496a8944","Type":"ContainerStarted","Data":"97e6400dd41fc7bbdcc5a518c836c5b82b72bfd317863798805582fb7c707ad2"} Mar 14 06:06:05 crc kubenswrapper[4713]: I0314 06:06:05.577884 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:06:05 crc kubenswrapper[4713]: I0314 06:06:05.578411 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:06:06 crc kubenswrapper[4713]: I0314 06:06:06.625447 4713 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/community-operators-j52jx" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" containerName="registry-server" probeResult="failure" output=< Mar 14 06:06:06 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:06:06 crc kubenswrapper[4713]: > Mar 14 06:06:07 crc kubenswrapper[4713]: I0314 06:06:07.408631 4713 generic.go:334] "Generic (PLEG): container finished" podID="aca83833-8133-4064-b1b3-05989d69b5b0" containerID="8fd055004ded5cd85df7c55ebebb7ea9a05e766c94d1b078c917601e7ac14fd9" exitCode=0 Mar 14 06:06:07 crc kubenswrapper[4713]: I0314 06:06:07.408737 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" event={"ID":"aca83833-8133-4064-b1b3-05989d69b5b0","Type":"ContainerDied","Data":"8fd055004ded5cd85df7c55ebebb7ea9a05e766c94d1b078c917601e7ac14fd9"} Mar 14 06:06:08 crc kubenswrapper[4713]: I0314 06:06:08.424079 4713 generic.go:334] "Generic (PLEG): container finished" podID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerID="7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8" exitCode=0 Mar 14 06:06:08 crc kubenswrapper[4713]: I0314 06:06:08.424181 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lt55" event={"ID":"631d005b-56f9-43e3-8eb9-f37554b25ebe","Type":"ContainerDied","Data":"7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8"} Mar 14 06:06:08 crc kubenswrapper[4713]: I0314 06:06:08.428449 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557806-q87kz" event={"ID":"0eb67cc5-82d7-41b1-be5e-308a496a8944","Type":"ContainerStarted","Data":"cab44d3e44aa2cb3c991dcd7f30964d4168812a6d2b8b4aec45586cf4963b566"} Mar 14 06:06:08 crc kubenswrapper[4713]: I0314 06:06:08.486967 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557806-q87kz" 
podStartSLOduration=2.551060944 podStartE2EDuration="8.48694357s" podCreationTimestamp="2026-03-14 06:06:00 +0000 UTC" firstStartedPulling="2026-03-14 06:06:00.961845983 +0000 UTC m=+2344.049755283" lastFinishedPulling="2026-03-14 06:06:06.897728609 +0000 UTC m=+2349.985637909" observedRunningTime="2026-03-14 06:06:08.471937329 +0000 UTC m=+2351.559846629" watchObservedRunningTime="2026-03-14 06:06:08.48694357 +0000 UTC m=+2351.574852870" Mar 14 06:06:08 crc kubenswrapper[4713]: I0314 06:06:08.942774 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.107413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xk2d\" (UniqueName: \"kubernetes.io/projected/aca83833-8133-4064-b1b3-05989d69b5b0-kube-api-access-8xk2d\") pod \"aca83833-8133-4064-b1b3-05989d69b5b0\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.107540 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-ssh-key-openstack-edpm-ipam\") pod \"aca83833-8133-4064-b1b3-05989d69b5b0\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.107626 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-inventory\") pod \"aca83833-8133-4064-b1b3-05989d69b5b0\" (UID: \"aca83833-8133-4064-b1b3-05989d69b5b0\") " Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.129979 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca83833-8133-4064-b1b3-05989d69b5b0-kube-api-access-8xk2d" (OuterVolumeSpecName: 
"kube-api-access-8xk2d") pod "aca83833-8133-4064-b1b3-05989d69b5b0" (UID: "aca83833-8133-4064-b1b3-05989d69b5b0"). InnerVolumeSpecName "kube-api-access-8xk2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.182976 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-inventory" (OuterVolumeSpecName: "inventory") pod "aca83833-8133-4064-b1b3-05989d69b5b0" (UID: "aca83833-8133-4064-b1b3-05989d69b5b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.189305 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "aca83833-8133-4064-b1b3-05989d69b5b0" (UID: "aca83833-8133-4064-b1b3-05989d69b5b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.214548 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xk2d\" (UniqueName: \"kubernetes.io/projected/aca83833-8133-4064-b1b3-05989d69b5b0-kube-api-access-8xk2d\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.214573 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.214582 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca83833-8133-4064-b1b3-05989d69b5b0-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.439879 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" event={"ID":"aca83833-8133-4064-b1b3-05989d69b5b0","Type":"ContainerDied","Data":"a1488cc51441d086b39d435b856a749539dbc109bd786e0f7be59fc3e780dab5"} Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.439911 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5mtz2" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.439940 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1488cc51441d086b39d435b856a749539dbc109bd786e0f7be59fc3e780dab5" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.527011 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp"] Mar 14 06:06:09 crc kubenswrapper[4713]: E0314 06:06:09.527623 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca83833-8133-4064-b1b3-05989d69b5b0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.527645 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca83833-8133-4064-b1b3-05989d69b5b0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.527867 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca83833-8133-4064-b1b3-05989d69b5b0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.528777 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.531486 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.531732 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.531893 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.531993 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.542753 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp"] Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.625060 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pxglp\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.625461 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbnsg\" (UniqueName: \"kubernetes.io/projected/1158aa83-e4b7-4231-a249-ee99e7f4d291-kube-api-access-zbnsg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pxglp\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc 
kubenswrapper[4713]: I0314 06:06:09.625518 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pxglp\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.727866 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pxglp\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.728088 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbnsg\" (UniqueName: \"kubernetes.io/projected/1158aa83-e4b7-4231-a249-ee99e7f4d291-kube-api-access-zbnsg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pxglp\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.728180 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pxglp\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.733363 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pxglp\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.739299 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pxglp\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.766251 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbnsg\" (UniqueName: \"kubernetes.io/projected/1158aa83-e4b7-4231-a249-ee99e7f4d291-kube-api-access-zbnsg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pxglp\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:09 crc kubenswrapper[4713]: I0314 06:06:09.852658 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:06:10 crc kubenswrapper[4713]: I0314 06:06:10.461221 4713 generic.go:334] "Generic (PLEG): container finished" podID="0eb67cc5-82d7-41b1-be5e-308a496a8944" containerID="cab44d3e44aa2cb3c991dcd7f30964d4168812a6d2b8b4aec45586cf4963b566" exitCode=0 Mar 14 06:06:10 crc kubenswrapper[4713]: I0314 06:06:10.461244 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557806-q87kz" event={"ID":"0eb67cc5-82d7-41b1-be5e-308a496a8944","Type":"ContainerDied","Data":"cab44d3e44aa2cb3c991dcd7f30964d4168812a6d2b8b4aec45586cf4963b566"} Mar 14 06:06:10 crc kubenswrapper[4713]: I0314 06:06:10.467367 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lt55" event={"ID":"631d005b-56f9-43e3-8eb9-f37554b25ebe","Type":"ContainerStarted","Data":"b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f"} Mar 14 06:06:10 crc kubenswrapper[4713]: I0314 06:06:10.500408 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp"] Mar 14 06:06:10 crc kubenswrapper[4713]: I0314 06:06:10.505833 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7lt55" podStartSLOduration=3.6953456 podStartE2EDuration="14.50581257s" podCreationTimestamp="2026-03-14 06:05:56 +0000 UTC" firstStartedPulling="2026-03-14 06:05:58.271976678 +0000 UTC m=+2341.359885978" lastFinishedPulling="2026-03-14 06:06:09.082443648 +0000 UTC m=+2352.170352948" observedRunningTime="2026-03-14 06:06:10.499549984 +0000 UTC m=+2353.587459274" watchObservedRunningTime="2026-03-14 06:06:10.50581257 +0000 UTC m=+2353.593721870" Mar 14 06:06:11 crc kubenswrapper[4713]: I0314 06:06:11.501538 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" event={"ID":"1158aa83-e4b7-4231-a249-ee99e7f4d291","Type":"ContainerStarted","Data":"3ff76d33549123deaada654711dfbca4842936ba9acb323fdd14f60f986fa35c"} Mar 14 06:06:11 crc kubenswrapper[4713]: I0314 06:06:11.502115 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" event={"ID":"1158aa83-e4b7-4231-a249-ee99e7f4d291","Type":"ContainerStarted","Data":"15eaa45ef8e43be1bc443977d90725ef3728c36b8b5bb51d903009d6934165ab"} Mar 14 06:06:11 crc kubenswrapper[4713]: I0314 06:06:11.534993 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" podStartSLOduration=2.098741026 podStartE2EDuration="2.534969473s" podCreationTimestamp="2026-03-14 06:06:09 +0000 UTC" firstStartedPulling="2026-03-14 06:06:10.517897598 +0000 UTC m=+2353.605806908" lastFinishedPulling="2026-03-14 06:06:10.954126045 +0000 UTC m=+2354.042035355" observedRunningTime="2026-03-14 06:06:11.52527489 +0000 UTC m=+2354.613184190" watchObservedRunningTime="2026-03-14 06:06:11.534969473 +0000 UTC m=+2354.622878783" Mar 14 06:06:11 crc kubenswrapper[4713]: I0314 06:06:11.935544 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557806-q87kz" Mar 14 06:06:12 crc kubenswrapper[4713]: I0314 06:06:12.008762 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwr4q\" (UniqueName: \"kubernetes.io/projected/0eb67cc5-82d7-41b1-be5e-308a496a8944-kube-api-access-zwr4q\") pod \"0eb67cc5-82d7-41b1-be5e-308a496a8944\" (UID: \"0eb67cc5-82d7-41b1-be5e-308a496a8944\") " Mar 14 06:06:12 crc kubenswrapper[4713]: I0314 06:06:12.017281 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb67cc5-82d7-41b1-be5e-308a496a8944-kube-api-access-zwr4q" (OuterVolumeSpecName: "kube-api-access-zwr4q") pod "0eb67cc5-82d7-41b1-be5e-308a496a8944" (UID: "0eb67cc5-82d7-41b1-be5e-308a496a8944"). InnerVolumeSpecName "kube-api-access-zwr4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:06:12 crc kubenswrapper[4713]: I0314 06:06:12.113922 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwr4q\" (UniqueName: \"kubernetes.io/projected/0eb67cc5-82d7-41b1-be5e-308a496a8944-kube-api-access-zwr4q\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:12 crc kubenswrapper[4713]: I0314 06:06:12.530382 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557806-q87kz" Mar 14 06:06:12 crc kubenswrapper[4713]: I0314 06:06:12.530431 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557806-q87kz" event={"ID":"0eb67cc5-82d7-41b1-be5e-308a496a8944","Type":"ContainerDied","Data":"97e6400dd41fc7bbdcc5a518c836c5b82b72bfd317863798805582fb7c707ad2"} Mar 14 06:06:12 crc kubenswrapper[4713]: I0314 06:06:12.530472 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97e6400dd41fc7bbdcc5a518c836c5b82b72bfd317863798805582fb7c707ad2" Mar 14 06:06:12 crc kubenswrapper[4713]: I0314 06:06:12.550746 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557800-6ppxr"] Mar 14 06:06:12 crc kubenswrapper[4713]: I0314 06:06:12.560428 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557800-6ppxr"] Mar 14 06:06:13 crc kubenswrapper[4713]: I0314 06:06:13.576705 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3dbe397-34ac-4c7f-990e-cec91d6592a4" path="/var/lib/kubelet/pods/b3dbe397-34ac-4c7f-990e-cec91d6592a4/volumes" Mar 14 06:06:15 crc kubenswrapper[4713]: I0314 06:06:15.627753 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:06:15 crc kubenswrapper[4713]: I0314 06:06:15.705550 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:06:16 crc kubenswrapper[4713]: I0314 06:06:16.444435 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j52jx"] Mar 14 06:06:17 crc kubenswrapper[4713]: I0314 06:06:17.172727 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:06:17 crc 
kubenswrapper[4713]: I0314 06:06:17.172998 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:06:17 crc kubenswrapper[4713]: I0314 06:06:17.585915 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j52jx" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" containerName="registry-server" containerID="cri-o://84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef" gracePeriod=2 Mar 14 06:06:17 crc kubenswrapper[4713]: E0314 06:06:17.848118 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b927ae_84b8_4869_9783_b379c7e79611.slice/crio-conmon-84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b927ae_84b8_4869_9783_b379c7e79611.slice/crio-84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef.scope\": RecentStats: unable to find data in memory cache]" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.096229 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.235363 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7lt55" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerName="registry-server" probeResult="failure" output=< Mar 14 06:06:18 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:06:18 crc kubenswrapper[4713]: > Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.263165 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-utilities\") pod \"e7b927ae-84b8-4869-9783-b379c7e79611\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.263245 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gj29\" (UniqueName: \"kubernetes.io/projected/e7b927ae-84b8-4869-9783-b379c7e79611-kube-api-access-9gj29\") pod \"e7b927ae-84b8-4869-9783-b379c7e79611\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.263291 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-catalog-content\") pod \"e7b927ae-84b8-4869-9783-b379c7e79611\" (UID: \"e7b927ae-84b8-4869-9783-b379c7e79611\") " Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.265559 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-utilities" (OuterVolumeSpecName: "utilities") pod "e7b927ae-84b8-4869-9783-b379c7e79611" (UID: "e7b927ae-84b8-4869-9783-b379c7e79611"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.285053 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b927ae-84b8-4869-9783-b379c7e79611-kube-api-access-9gj29" (OuterVolumeSpecName: "kube-api-access-9gj29") pod "e7b927ae-84b8-4869-9783-b379c7e79611" (UID: "e7b927ae-84b8-4869-9783-b379c7e79611"). InnerVolumeSpecName "kube-api-access-9gj29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.330926 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7b927ae-84b8-4869-9783-b379c7e79611" (UID: "e7b927ae-84b8-4869-9783-b379c7e79611"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.366544 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.366596 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gj29\" (UniqueName: \"kubernetes.io/projected/e7b927ae-84b8-4869-9783-b379c7e79611-kube-api-access-9gj29\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.366607 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7b927ae-84b8-4869-9783-b379c7e79611-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.597414 4713 generic.go:334] "Generic (PLEG): container finished" podID="e7b927ae-84b8-4869-9783-b379c7e79611" 
containerID="84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef" exitCode=0 Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.597460 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j52jx" event={"ID":"e7b927ae-84b8-4869-9783-b379c7e79611","Type":"ContainerDied","Data":"84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef"} Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.597472 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j52jx" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.597494 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j52jx" event={"ID":"e7b927ae-84b8-4869-9783-b379c7e79611","Type":"ContainerDied","Data":"51d34aae3550b72d15b072fd54ac4cdad5adba09271a28bf8312472e6ca9d11b"} Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.597516 4713 scope.go:117] "RemoveContainer" containerID="84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.631769 4713 scope.go:117] "RemoveContainer" containerID="d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.636734 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j52jx"] Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.647028 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j52jx"] Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.682830 4713 scope.go:117] "RemoveContainer" containerID="042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.723337 4713 scope.go:117] "RemoveContainer" containerID="84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef" Mar 14 
06:06:18 crc kubenswrapper[4713]: E0314 06:06:18.723942 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef\": container with ID starting with 84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef not found: ID does not exist" containerID="84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.723997 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef"} err="failed to get container status \"84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef\": rpc error: code = NotFound desc = could not find container \"84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef\": container with ID starting with 84a4bbd2d3f477306df977656a9057220c7b6a3bc506e57b359e2b47f3bca1ef not found: ID does not exist" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.724027 4713 scope.go:117] "RemoveContainer" containerID="d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026" Mar 14 06:06:18 crc kubenswrapper[4713]: E0314 06:06:18.724495 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026\": container with ID starting with d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026 not found: ID does not exist" containerID="d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.724548 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026"} err="failed to get container status 
\"d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026\": rpc error: code = NotFound desc = could not find container \"d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026\": container with ID starting with d59b8a0c1c3756f74c0b3855cc68b83fe1f78f8e2a906e19ecd2281f1fb28026 not found: ID does not exist" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.724579 4713 scope.go:117] "RemoveContainer" containerID="042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1" Mar 14 06:06:18 crc kubenswrapper[4713]: E0314 06:06:18.724913 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1\": container with ID starting with 042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1 not found: ID does not exist" containerID="042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1" Mar 14 06:06:18 crc kubenswrapper[4713]: I0314 06:06:18.724948 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1"} err="failed to get container status \"042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1\": rpc error: code = NotFound desc = could not find container \"042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1\": container with ID starting with 042cc00cb769b3bae65e24e4abc35d40cdd88ada5eac2af11ca62ddacd04e8f1 not found: ID does not exist" Mar 14 06:06:19 crc kubenswrapper[4713]: I0314 06:06:19.580078 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" path="/var/lib/kubelet/pods/e7b927ae-84b8-4869-9783-b379c7e79611/volumes" Mar 14 06:06:27 crc kubenswrapper[4713]: I0314 06:06:27.222773 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:06:27 crc kubenswrapper[4713]: I0314 06:06:27.271884 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:06:28 crc kubenswrapper[4713]: I0314 06:06:28.052830 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lt55"] Mar 14 06:06:28 crc kubenswrapper[4713]: I0314 06:06:28.709508 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7lt55" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerName="registry-server" containerID="cri-o://b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f" gracePeriod=2 Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.276143 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.443023 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-catalog-content\") pod \"631d005b-56f9-43e3-8eb9-f37554b25ebe\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.443171 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dndbl\" (UniqueName: \"kubernetes.io/projected/631d005b-56f9-43e3-8eb9-f37554b25ebe-kube-api-access-dndbl\") pod \"631d005b-56f9-43e3-8eb9-f37554b25ebe\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.443510 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-utilities\") pod 
\"631d005b-56f9-43e3-8eb9-f37554b25ebe\" (UID: \"631d005b-56f9-43e3-8eb9-f37554b25ebe\") " Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.444372 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-utilities" (OuterVolumeSpecName: "utilities") pod "631d005b-56f9-43e3-8eb9-f37554b25ebe" (UID: "631d005b-56f9-43e3-8eb9-f37554b25ebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.444723 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.449839 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/631d005b-56f9-43e3-8eb9-f37554b25ebe-kube-api-access-dndbl" (OuterVolumeSpecName: "kube-api-access-dndbl") pod "631d005b-56f9-43e3-8eb9-f37554b25ebe" (UID: "631d005b-56f9-43e3-8eb9-f37554b25ebe"). InnerVolumeSpecName "kube-api-access-dndbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.492621 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "631d005b-56f9-43e3-8eb9-f37554b25ebe" (UID: "631d005b-56f9-43e3-8eb9-f37554b25ebe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.547011 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631d005b-56f9-43e3-8eb9-f37554b25ebe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.547041 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dndbl\" (UniqueName: \"kubernetes.io/projected/631d005b-56f9-43e3-8eb9-f37554b25ebe-kube-api-access-dndbl\") on node \"crc\" DevicePath \"\"" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.728787 4713 generic.go:334] "Generic (PLEG): container finished" podID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerID="b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f" exitCode=0 Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.728834 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lt55" event={"ID":"631d005b-56f9-43e3-8eb9-f37554b25ebe","Type":"ContainerDied","Data":"b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f"} Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.728865 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lt55" event={"ID":"631d005b-56f9-43e3-8eb9-f37554b25ebe","Type":"ContainerDied","Data":"aec44f9d52895aa7b68c45567ea7d67eaf342dbfabd795aca972377a31e08e5a"} Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.728881 4713 scope.go:117] "RemoveContainer" containerID="b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.729017 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lt55" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.763747 4713 scope.go:117] "RemoveContainer" containerID="7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.763816 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lt55"] Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.773320 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7lt55"] Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.789118 4713 scope.go:117] "RemoveContainer" containerID="4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.857988 4713 scope.go:117] "RemoveContainer" containerID="b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f" Mar 14 06:06:29 crc kubenswrapper[4713]: E0314 06:06:29.858550 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f\": container with ID starting with b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f not found: ID does not exist" containerID="b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.858637 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f"} err="failed to get container status \"b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f\": rpc error: code = NotFound desc = could not find container \"b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f\": container with ID starting with b4059e12366f0ebdedb3a1c9a13d209288ee2d8b062b3b96052709c8c982b05f not 
found: ID does not exist" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.858672 4713 scope.go:117] "RemoveContainer" containerID="7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8" Mar 14 06:06:29 crc kubenswrapper[4713]: E0314 06:06:29.859195 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8\": container with ID starting with 7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8 not found: ID does not exist" containerID="7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.859287 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8"} err="failed to get container status \"7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8\": rpc error: code = NotFound desc = could not find container \"7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8\": container with ID starting with 7b818655e53ff37045b4d311ecb398ab2c2e39a22f23003e9140c6a3b07c10b8 not found: ID does not exist" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.859333 4713 scope.go:117] "RemoveContainer" containerID="4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3" Mar 14 06:06:29 crc kubenswrapper[4713]: E0314 06:06:29.859869 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3\": container with ID starting with 4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3 not found: ID does not exist" containerID="4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3" Mar 14 06:06:29 crc kubenswrapper[4713]: I0314 06:06:29.859894 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3"} err="failed to get container status \"4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3\": rpc error: code = NotFound desc = could not find container \"4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3\": container with ID starting with 4da9831f8863da3df66b955d8eb01522899e140f71afec2610970c3dcd5a35a3 not found: ID does not exist" Mar 14 06:06:31 crc kubenswrapper[4713]: I0314 06:06:31.579796 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" path="/var/lib/kubelet/pods/631d005b-56f9-43e3-8eb9-f37554b25ebe/volumes" Mar 14 06:06:40 crc kubenswrapper[4713]: I0314 06:06:40.731267 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:06:40 crc kubenswrapper[4713]: I0314 06:06:40.731843 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:06:50 crc kubenswrapper[4713]: I0314 06:06:50.194314 4713 scope.go:117] "RemoveContainer" containerID="24c1d9107c0fea6a541652c613b17559b142ac7f709d4a2c23933ff79704a398" Mar 14 06:07:00 crc kubenswrapper[4713]: I0314 06:07:00.050183 4713 generic.go:334] "Generic (PLEG): container finished" podID="1158aa83-e4b7-4231-a249-ee99e7f4d291" containerID="3ff76d33549123deaada654711dfbca4842936ba9acb323fdd14f60f986fa35c" exitCode=0 Mar 14 06:07:00 crc kubenswrapper[4713]: 
I0314 06:07:00.050334 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" event={"ID":"1158aa83-e4b7-4231-a249-ee99e7f4d291","Type":"ContainerDied","Data":"3ff76d33549123deaada654711dfbca4842936ba9acb323fdd14f60f986fa35c"} Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.627022 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.740890 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-inventory\") pod \"1158aa83-e4b7-4231-a249-ee99e7f4d291\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.741895 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-ssh-key-openstack-edpm-ipam\") pod \"1158aa83-e4b7-4231-a249-ee99e7f4d291\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.741953 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbnsg\" (UniqueName: \"kubernetes.io/projected/1158aa83-e4b7-4231-a249-ee99e7f4d291-kube-api-access-zbnsg\") pod \"1158aa83-e4b7-4231-a249-ee99e7f4d291\" (UID: \"1158aa83-e4b7-4231-a249-ee99e7f4d291\") " Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.748361 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1158aa83-e4b7-4231-a249-ee99e7f4d291-kube-api-access-zbnsg" (OuterVolumeSpecName: "kube-api-access-zbnsg") pod "1158aa83-e4b7-4231-a249-ee99e7f4d291" (UID: "1158aa83-e4b7-4231-a249-ee99e7f4d291"). 
InnerVolumeSpecName "kube-api-access-zbnsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.772538 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-inventory" (OuterVolumeSpecName: "inventory") pod "1158aa83-e4b7-4231-a249-ee99e7f4d291" (UID: "1158aa83-e4b7-4231-a249-ee99e7f4d291"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.782306 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1158aa83-e4b7-4231-a249-ee99e7f4d291" (UID: "1158aa83-e4b7-4231-a249-ee99e7f4d291"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.845307 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.845355 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbnsg\" (UniqueName: \"kubernetes.io/projected/1158aa83-e4b7-4231-a249-ee99e7f4d291-kube-api-access-zbnsg\") on node \"crc\" DevicePath \"\""
Mar 14 06:07:01 crc kubenswrapper[4713]: I0314 06:07:01.845375 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1158aa83-e4b7-4231-a249-ee99e7f4d291-inventory\") on node \"crc\" DevicePath \"\""
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.071802 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp" event={"ID":"1158aa83-e4b7-4231-a249-ee99e7f4d291","Type":"ContainerDied","Data":"15eaa45ef8e43be1bc443977d90725ef3728c36b8b5bb51d903009d6934165ab"}
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.071842 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15eaa45ef8e43be1bc443977d90725ef3728c36b8b5bb51d903009d6934165ab"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.071876 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pxglp"
Mar 14 06:07:02 crc kubenswrapper[4713]: E0314 06:07:02.166549 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1158aa83_e4b7_4231_a249_ee99e7f4d291.slice/crio-15eaa45ef8e43be1bc443977d90725ef3728c36b8b5bb51d903009d6934165ab\": RecentStats: unable to find data in memory cache]"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.168360 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j8xdw"]
Mar 14 06:07:02 crc kubenswrapper[4713]: E0314 06:07:02.168927 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerName="extract-utilities"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.168948 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerName="extract-utilities"
Mar 14 06:07:02 crc kubenswrapper[4713]: E0314 06:07:02.168972 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1158aa83-e4b7-4231-a249-ee99e7f4d291" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.168981 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1158aa83-e4b7-4231-a249-ee99e7f4d291" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 14 06:07:02 crc kubenswrapper[4713]: E0314 06:07:02.168994 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerName="extract-content"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169002 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerName="extract-content"
Mar 14 06:07:02 crc kubenswrapper[4713]: E0314 06:07:02.169010 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" containerName="extract-utilities"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169017 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" containerName="extract-utilities"
Mar 14 06:07:02 crc kubenswrapper[4713]: E0314 06:07:02.169058 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerName="registry-server"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169066 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerName="registry-server"
Mar 14 06:07:02 crc kubenswrapper[4713]: E0314 06:07:02.169084 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" containerName="registry-server"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169092 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" containerName="registry-server"
Mar 14 06:07:02 crc kubenswrapper[4713]: E0314 06:07:02.169103 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" containerName="extract-content"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169111 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" containerName="extract-content"
Mar 14 06:07:02 crc kubenswrapper[4713]: E0314 06:07:02.169148 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb67cc5-82d7-41b1-be5e-308a496a8944" containerName="oc"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169155 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb67cc5-82d7-41b1-be5e-308a496a8944" containerName="oc"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169400 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1158aa83-e4b7-4231-a249-ee99e7f4d291" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169418 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b927ae-84b8-4869-9783-b379c7e79611" containerName="registry-server"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169433 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="631d005b-56f9-43e3-8eb9-f37554b25ebe" containerName="registry-server"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.169447 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb67cc5-82d7-41b1-be5e-308a496a8944" containerName="oc"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.170290 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.172136 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.173923 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.173929 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.174230 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.181653 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j8xdw"]
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.253497 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j8xdw\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.253658 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j8xdw\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.253684 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpz4v\" (UniqueName: \"kubernetes.io/projected/cf1997f7-a846-45ea-bc72-6db299d42afe-kube-api-access-bpz4v\") pod \"ssh-known-hosts-edpm-deployment-j8xdw\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.355677 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j8xdw\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.356195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j8xdw\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.356335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpz4v\" (UniqueName: \"kubernetes.io/projected/cf1997f7-a846-45ea-bc72-6db299d42afe-kube-api-access-bpz4v\") pod \"ssh-known-hosts-edpm-deployment-j8xdw\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.371923 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j8xdw\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.372437 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j8xdw\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.387676 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpz4v\" (UniqueName: \"kubernetes.io/projected/cf1997f7-a846-45ea-bc72-6db299d42afe-kube-api-access-bpz4v\") pod \"ssh-known-hosts-edpm-deployment-j8xdw\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:02 crc kubenswrapper[4713]: I0314 06:07:02.490251 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:03 crc kubenswrapper[4713]: I0314 06:07:03.055831 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j8xdw"]
Mar 14 06:07:03 crc kubenswrapper[4713]: I0314 06:07:03.088002 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw" event={"ID":"cf1997f7-a846-45ea-bc72-6db299d42afe","Type":"ContainerStarted","Data":"cc9bc469f8d34312c189dd37bc482674b1d81429d229915190c52d93aac44717"}
Mar 14 06:07:04 crc kubenswrapper[4713]: I0314 06:07:04.102998 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw" event={"ID":"cf1997f7-a846-45ea-bc72-6db299d42afe","Type":"ContainerStarted","Data":"5fa172ba7f723f003202dac1f6460e803d3fb553236ec46161789c5b8686f2fc"}
Mar 14 06:07:04 crc kubenswrapper[4713]: I0314 06:07:04.134153 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw" podStartSLOduration=1.5293806939999999 podStartE2EDuration="2.134118302s" podCreationTimestamp="2026-03-14 06:07:02 +0000 UTC" firstStartedPulling="2026-03-14 06:07:03.059517724 +0000 UTC m=+2406.147427024" lastFinishedPulling="2026-03-14 06:07:03.664255332 +0000 UTC m=+2406.752164632" observedRunningTime="2026-03-14 06:07:04.117962456 +0000 UTC m=+2407.205871756" watchObservedRunningTime="2026-03-14 06:07:04.134118302 +0000 UTC m=+2407.222027592"
Mar 14 06:07:10 crc kubenswrapper[4713]: I0314 06:07:10.731610 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 06:07:10 crc kubenswrapper[4713]: I0314 06:07:10.732045 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 06:07:11 crc kubenswrapper[4713]: I0314 06:07:11.191074 4713 generic.go:334] "Generic (PLEG): container finished" podID="cf1997f7-a846-45ea-bc72-6db299d42afe" containerID="5fa172ba7f723f003202dac1f6460e803d3fb553236ec46161789c5b8686f2fc" exitCode=0
Mar 14 06:07:11 crc kubenswrapper[4713]: I0314 06:07:11.191119 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw" event={"ID":"cf1997f7-a846-45ea-bc72-6db299d42afe","Type":"ContainerDied","Data":"5fa172ba7f723f003202dac1f6460e803d3fb553236ec46161789c5b8686f2fc"}
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.701144 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.740369 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-inventory-0\") pod \"cf1997f7-a846-45ea-bc72-6db299d42afe\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") "
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.740428 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-ssh-key-openstack-edpm-ipam\") pod \"cf1997f7-a846-45ea-bc72-6db299d42afe\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") "
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.740532 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpz4v\" (UniqueName: \"kubernetes.io/projected/cf1997f7-a846-45ea-bc72-6db299d42afe-kube-api-access-bpz4v\") pod \"cf1997f7-a846-45ea-bc72-6db299d42afe\" (UID: \"cf1997f7-a846-45ea-bc72-6db299d42afe\") "
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.746468 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1997f7-a846-45ea-bc72-6db299d42afe-kube-api-access-bpz4v" (OuterVolumeSpecName: "kube-api-access-bpz4v") pod "cf1997f7-a846-45ea-bc72-6db299d42afe" (UID: "cf1997f7-a846-45ea-bc72-6db299d42afe"). InnerVolumeSpecName "kube-api-access-bpz4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.778578 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cf1997f7-a846-45ea-bc72-6db299d42afe" (UID: "cf1997f7-a846-45ea-bc72-6db299d42afe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.781226 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "cf1997f7-a846-45ea-bc72-6db299d42afe" (UID: "cf1997f7-a846-45ea-bc72-6db299d42afe"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.844315 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpz4v\" (UniqueName: \"kubernetes.io/projected/cf1997f7-a846-45ea-bc72-6db299d42afe-kube-api-access-bpz4v\") on node \"crc\" DevicePath \"\""
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.844358 4713 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 14 06:07:12 crc kubenswrapper[4713]: I0314 06:07:12.844370 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf1997f7-a846-45ea-bc72-6db299d42afe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.215422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw" event={"ID":"cf1997f7-a846-45ea-bc72-6db299d42afe","Type":"ContainerDied","Data":"cc9bc469f8d34312c189dd37bc482674b1d81429d229915190c52d93aac44717"}
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.215773 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc9bc469f8d34312c189dd37bc482674b1d81429d229915190c52d93aac44717"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.215507 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j8xdw"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.297455 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"]
Mar 14 06:07:13 crc kubenswrapper[4713]: E0314 06:07:13.298057 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1997f7-a846-45ea-bc72-6db299d42afe" containerName="ssh-known-hosts-edpm-deployment"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.298078 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1997f7-a846-45ea-bc72-6db299d42afe" containerName="ssh-known-hosts-edpm-deployment"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.311363 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1997f7-a846-45ea-bc72-6db299d42afe" containerName="ssh-known-hosts-edpm-deployment"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.312659 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.314627 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.315051 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.315230 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.315326 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.326048 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"]
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.355928 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xnxz\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.356025 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xnxz\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.356157 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx2qd\" (UniqueName: \"kubernetes.io/projected/1b5e5638-e71f-47c0-a136-7530a65e7053-kube-api-access-vx2qd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xnxz\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.458254 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xnxz\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.458399 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx2qd\" (UniqueName: \"kubernetes.io/projected/1b5e5638-e71f-47c0-a136-7530a65e7053-kube-api-access-vx2qd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xnxz\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.458539 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xnxz\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.462253 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xnxz\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.462421 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xnxz\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.475574 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx2qd\" (UniqueName: \"kubernetes.io/projected/1b5e5638-e71f-47c0-a136-7530a65e7053-kube-api-access-vx2qd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xnxz\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:13 crc kubenswrapper[4713]: I0314 06:07:13.640001 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:14 crc kubenswrapper[4713]: I0314 06:07:14.168124 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"]
Mar 14 06:07:14 crc kubenswrapper[4713]: I0314 06:07:14.181576 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 06:07:14 crc kubenswrapper[4713]: I0314 06:07:14.233808 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz" event={"ID":"1b5e5638-e71f-47c0-a136-7530a65e7053","Type":"ContainerStarted","Data":"1910a2be2b2dfdca3180bc4f109352611704d08690c057f18df6b261bf3832e6"}
Mar 14 06:07:15 crc kubenswrapper[4713]: I0314 06:07:15.249387 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz" event={"ID":"1b5e5638-e71f-47c0-a136-7530a65e7053","Type":"ContainerStarted","Data":"23d861a979a1f9cef66b4970764207fe0c6c6338ee2a0f63fffc5b91de03a3c1"}
Mar 14 06:07:15 crc kubenswrapper[4713]: I0314 06:07:15.268399 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz" podStartSLOduration=1.860843278 podStartE2EDuration="2.268380674s" podCreationTimestamp="2026-03-14 06:07:13 +0000 UTC" firstStartedPulling="2026-03-14 06:07:14.181365026 +0000 UTC m=+2417.269274326" lastFinishedPulling="2026-03-14 06:07:14.588902422 +0000 UTC m=+2417.676811722" observedRunningTime="2026-03-14 06:07:15.262938673 +0000 UTC m=+2418.350848033" watchObservedRunningTime="2026-03-14 06:07:15.268380674 +0000 UTC m=+2418.356289974"
Mar 14 06:07:23 crc kubenswrapper[4713]: I0314 06:07:23.348492 4713 generic.go:334] "Generic (PLEG): container finished" podID="1b5e5638-e71f-47c0-a136-7530a65e7053" containerID="23d861a979a1f9cef66b4970764207fe0c6c6338ee2a0f63fffc5b91de03a3c1" exitCode=0
Mar 14 06:07:23 crc kubenswrapper[4713]: I0314 06:07:23.348595 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz" event={"ID":"1b5e5638-e71f-47c0-a136-7530a65e7053","Type":"ContainerDied","Data":"23d861a979a1f9cef66b4970764207fe0c6c6338ee2a0f63fffc5b91de03a3c1"}
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:24.999593 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.062525 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-inventory\") pod \"1b5e5638-e71f-47c0-a136-7530a65e7053\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") "
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.062626 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-ssh-key-openstack-edpm-ipam\") pod \"1b5e5638-e71f-47c0-a136-7530a65e7053\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") "
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.062767 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx2qd\" (UniqueName: \"kubernetes.io/projected/1b5e5638-e71f-47c0-a136-7530a65e7053-kube-api-access-vx2qd\") pod \"1b5e5638-e71f-47c0-a136-7530a65e7053\" (UID: \"1b5e5638-e71f-47c0-a136-7530a65e7053\") "
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.068998 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5e5638-e71f-47c0-a136-7530a65e7053-kube-api-access-vx2qd" (OuterVolumeSpecName: "kube-api-access-vx2qd") pod "1b5e5638-e71f-47c0-a136-7530a65e7053" (UID: "1b5e5638-e71f-47c0-a136-7530a65e7053"). InnerVolumeSpecName "kube-api-access-vx2qd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.100543 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1b5e5638-e71f-47c0-a136-7530a65e7053" (UID: "1b5e5638-e71f-47c0-a136-7530a65e7053"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.122811 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-inventory" (OuterVolumeSpecName: "inventory") pod "1b5e5638-e71f-47c0-a136-7530a65e7053" (UID: "1b5e5638-e71f-47c0-a136-7530a65e7053"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.166447 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-inventory\") on node \"crc\" DevicePath \"\""
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.166491 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b5e5638-e71f-47c0-a136-7530a65e7053-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.166507 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx2qd\" (UniqueName: \"kubernetes.io/projected/1b5e5638-e71f-47c0-a136-7530a65e7053-kube-api-access-vx2qd\") on node \"crc\" DevicePath \"\""
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.370910 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz" event={"ID":"1b5e5638-e71f-47c0-a136-7530a65e7053","Type":"ContainerDied","Data":"1910a2be2b2dfdca3180bc4f109352611704d08690c057f18df6b261bf3832e6"}
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.370954 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1910a2be2b2dfdca3180bc4f109352611704d08690c057f18df6b261bf3832e6"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.370966 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xnxz"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.458243 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"]
Mar 14 06:07:25 crc kubenswrapper[4713]: E0314 06:07:25.458739 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5e5638-e71f-47c0-a136-7530a65e7053" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.458758 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5e5638-e71f-47c0-a136-7530a65e7053" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.459100 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5e5638-e71f-47c0-a136-7530a65e7053" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.460047 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.462256 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.463371 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.465795 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.466744 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.473975 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"]
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.575057 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkt8z\" (UniqueName: \"kubernetes.io/projected/b8311ebd-6052-4bcf-98a4-15cde102418b-kube-api-access-kkt8z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-476vs\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.575370 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-476vs\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.575397 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-476vs\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.677510 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-476vs\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.677555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-476vs\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.677708 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkt8z\" (UniqueName: \"kubernetes.io/projected/b8311ebd-6052-4bcf-98a4-15cde102418b-kube-api-access-kkt8z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-476vs\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.684274 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-476vs\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.684599 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-476vs\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.706537 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkt8z\" (UniqueName: \"kubernetes.io/projected/b8311ebd-6052-4bcf-98a4-15cde102418b-kube-api-access-kkt8z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-476vs\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:25 crc kubenswrapper[4713]: I0314 06:07:25.780949 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"
Mar 14 06:07:26 crc kubenswrapper[4713]: I0314 06:07:26.353148 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs"]
Mar 14 06:07:26 crc kubenswrapper[4713]: I0314 06:07:26.382139 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs" event={"ID":"b8311ebd-6052-4bcf-98a4-15cde102418b","Type":"ContainerStarted","Data":"528385132a696c28664d6d3df32db852def28c1cd0a57442d3e8a1cb24493d86"}
Mar 14 06:07:27 crc kubenswrapper[4713]: I0314 06:07:27.398457 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs" event={"ID":"b8311ebd-6052-4bcf-98a4-15cde102418b","Type":"ContainerStarted","Data":"2ff7a7de53bf32ff74c293004fdb31aad99944938ff017f21e2933faeb173a38"}
Mar 14 06:07:27 crc kubenswrapper[4713]: I0314 06:07:27.420772 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs" podStartSLOduration=1.982936786 podStartE2EDuration="2.420753282s" podCreationTimestamp="2026-03-14 06:07:25 +0000 UTC" firstStartedPulling="2026-03-14 06:07:26.353785133 +0000 UTC m=+2429.441694433" lastFinishedPulling="2026-03-14 06:07:26.791601629 +0000 UTC m=+2429.879510929" observedRunningTime="2026-03-14 06:07:27.414170405 +0000 UTC m=+2430.502079705" watchObservedRunningTime="2026-03-14 06:07:27.420753282 +0000 UTC m=+2430.508662582"
Mar 14 06:07:37 crc kubenswrapper[4713]: I0314 06:07:37.502126 4713 generic.go:334] "Generic (PLEG): container finished" podID="b8311ebd-6052-4bcf-98a4-15cde102418b" containerID="2ff7a7de53bf32ff74c293004fdb31aad99944938ff017f21e2933faeb173a38" exitCode=0
Mar 14 06:07:37 crc kubenswrapper[4713]: I0314 06:07:37.502284 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs" event={"ID":"b8311ebd-6052-4bcf-98a4-15cde102418b","Type":"ContainerDied","Data":"2ff7a7de53bf32ff74c293004fdb31aad99944938ff017f21e2933faeb173a38"} Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.287445 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.334688 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-inventory\") pod \"b8311ebd-6052-4bcf-98a4-15cde102418b\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.334852 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-ssh-key-openstack-edpm-ipam\") pod \"b8311ebd-6052-4bcf-98a4-15cde102418b\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.334898 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkt8z\" (UniqueName: \"kubernetes.io/projected/b8311ebd-6052-4bcf-98a4-15cde102418b-kube-api-access-kkt8z\") pod \"b8311ebd-6052-4bcf-98a4-15cde102418b\" (UID: \"b8311ebd-6052-4bcf-98a4-15cde102418b\") " Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.347808 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8311ebd-6052-4bcf-98a4-15cde102418b-kube-api-access-kkt8z" (OuterVolumeSpecName: "kube-api-access-kkt8z") pod "b8311ebd-6052-4bcf-98a4-15cde102418b" (UID: "b8311ebd-6052-4bcf-98a4-15cde102418b"). InnerVolumeSpecName "kube-api-access-kkt8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.367954 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-inventory" (OuterVolumeSpecName: "inventory") pod "b8311ebd-6052-4bcf-98a4-15cde102418b" (UID: "b8311ebd-6052-4bcf-98a4-15cde102418b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.380300 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b8311ebd-6052-4bcf-98a4-15cde102418b" (UID: "b8311ebd-6052-4bcf-98a4-15cde102418b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.440067 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.440143 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8311ebd-6052-4bcf-98a4-15cde102418b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.440158 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkt8z\" (UniqueName: \"kubernetes.io/projected/b8311ebd-6052-4bcf-98a4-15cde102418b-kube-api-access-kkt8z\") on node \"crc\" DevicePath \"\"" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.522177 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs" 
event={"ID":"b8311ebd-6052-4bcf-98a4-15cde102418b","Type":"ContainerDied","Data":"528385132a696c28664d6d3df32db852def28c1cd0a57442d3e8a1cb24493d86"} Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.522264 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="528385132a696c28664d6d3df32db852def28c1cd0a57442d3e8a1cb24493d86" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.522269 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-476vs" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.673263 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms"] Mar 14 06:07:39 crc kubenswrapper[4713]: E0314 06:07:39.673949 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8311ebd-6052-4bcf-98a4-15cde102418b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.673974 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8311ebd-6052-4bcf-98a4-15cde102418b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.674247 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8311ebd-6052-4bcf-98a4-15cde102418b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.675257 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.677902 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.677975 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.677999 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.678004 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.678012 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.678262 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.679466 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.679679 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.679882 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.691676 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms"] Mar 14 06:07:39 crc 
kubenswrapper[4713]: I0314 06:07:39.747993 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.748051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rt96\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-kube-api-access-6rt96\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.748185 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.748359 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 
06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.748448 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.748579 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.748628 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.748686 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.748860 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.748910 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.749039 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.749078 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.749135 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.749171 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.749237 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.749291 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 
crc kubenswrapper[4713]: I0314 06:07:39.852042 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852118 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852169 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852243 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: 
I0314 06:07:39.852283 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852309 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rt96\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-kube-api-access-6rt96\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852454 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852492 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852528 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852563 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852594 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852629 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852719 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852756 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852841 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.852874 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.883943 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.887861 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.896909 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.897682 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.897702 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-power-monitoring-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.898541 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.901974 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.902114 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.902877 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.907184 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.907793 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.909066 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.910845 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: 
I0314 06:07:39.916675 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.916980 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:39 crc kubenswrapper[4713]: I0314 06:07:39.918407 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rt96\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-kube-api-access-6rt96\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fflms\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:40 crc kubenswrapper[4713]: I0314 06:07:40.033457 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:07:40 crc kubenswrapper[4713]: I0314 06:07:40.593897 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms"] Mar 14 06:07:40 crc kubenswrapper[4713]: I0314 06:07:40.731474 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:07:40 crc kubenswrapper[4713]: I0314 06:07:40.731534 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:07:40 crc kubenswrapper[4713]: I0314 06:07:40.731686 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 06:07:40 crc kubenswrapper[4713]: I0314 06:07:40.732783 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 06:07:40 crc kubenswrapper[4713]: I0314 06:07:40.732841 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" 
containerID="cri-o://6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" gracePeriod=600 Mar 14 06:07:40 crc kubenswrapper[4713]: E0314 06:07:40.858006 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:07:41 crc kubenswrapper[4713]: I0314 06:07:41.547073 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" exitCode=0 Mar 14 06:07:41 crc kubenswrapper[4713]: I0314 06:07:41.547136 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"} Mar 14 06:07:41 crc kubenswrapper[4713]: I0314 06:07:41.547215 4713 scope.go:117] "RemoveContainer" containerID="9d66e0160c929053a77359b172aa48e22841072ca35af3852080f63092daa147" Mar 14 06:07:41 crc kubenswrapper[4713]: I0314 06:07:41.548327 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:07:41 crc kubenswrapper[4713]: E0314 06:07:41.548830 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:07:41 crc kubenswrapper[4713]: I0314 06:07:41.551389 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" event={"ID":"9ff839a7-6783-4526-a69b-66def7b3f8b4","Type":"ContainerStarted","Data":"f36f1b5967ad5075b2ed595de43c66a2f1830806b79b981148cc2f88fb46d26d"} Mar 14 06:07:41 crc kubenswrapper[4713]: I0314 06:07:41.551665 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" event={"ID":"9ff839a7-6783-4526-a69b-66def7b3f8b4","Type":"ContainerStarted","Data":"9fc08c536914a08e715ce30fde970a71fdc61b689cf5806d5f834b95f0b4a586"} Mar 14 06:07:41 crc kubenswrapper[4713]: I0314 06:07:41.605429 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" podStartSLOduration=2.175999751 podStartE2EDuration="2.605394161s" podCreationTimestamp="2026-03-14 06:07:39 +0000 UTC" firstStartedPulling="2026-03-14 06:07:40.610395829 +0000 UTC m=+2443.698305129" lastFinishedPulling="2026-03-14 06:07:41.039790239 +0000 UTC m=+2444.127699539" observedRunningTime="2026-03-14 06:07:41.597415491 +0000 UTC m=+2444.685324791" watchObservedRunningTime="2026-03-14 06:07:41.605394161 +0000 UTC m=+2444.693303461" Mar 14 06:07:46 crc kubenswrapper[4713]: I0314 06:07:46.041849 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-cv4bn"] Mar 14 06:07:46 crc kubenswrapper[4713]: I0314 06:07:46.055986 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-cv4bn"] Mar 14 06:07:47 crc kubenswrapper[4713]: I0314 06:07:47.575580 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0230bf57-ad22-415a-bc91-2269773f8097" path="/var/lib/kubelet/pods/0230bf57-ad22-415a-bc91-2269773f8097/volumes" Mar 14 06:07:50 crc kubenswrapper[4713]: I0314 
06:07:50.308577 4713 scope.go:117] "RemoveContainer" containerID="a42f03a318dba23533f4c912f47f8130c181c5cc6bdaa48dc08d536336be4c8e" Mar 14 06:07:56 crc kubenswrapper[4713]: I0314 06:07:56.564096 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:07:56 crc kubenswrapper[4713]: E0314 06:07:56.564890 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.145195 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557808-mfw6r"] Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.147283 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557808-mfw6r" Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.151041 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.151298 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.151419 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.157851 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557808-mfw6r"] Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.236704 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq8st\" (UniqueName: \"kubernetes.io/projected/592b0324-5eb1-47fe-ba55-b8f2e4ca924a-kube-api-access-gq8st\") pod \"auto-csr-approver-29557808-mfw6r\" (UID: \"592b0324-5eb1-47fe-ba55-b8f2e4ca924a\") " pod="openshift-infra/auto-csr-approver-29557808-mfw6r" Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.338323 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq8st\" (UniqueName: \"kubernetes.io/projected/592b0324-5eb1-47fe-ba55-b8f2e4ca924a-kube-api-access-gq8st\") pod \"auto-csr-approver-29557808-mfw6r\" (UID: \"592b0324-5eb1-47fe-ba55-b8f2e4ca924a\") " pod="openshift-infra/auto-csr-approver-29557808-mfw6r" Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.355288 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq8st\" (UniqueName: \"kubernetes.io/projected/592b0324-5eb1-47fe-ba55-b8f2e4ca924a-kube-api-access-gq8st\") pod \"auto-csr-approver-29557808-mfw6r\" (UID: \"592b0324-5eb1-47fe-ba55-b8f2e4ca924a\") " 
pod="openshift-infra/auto-csr-approver-29557808-mfw6r" Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.467380 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557808-mfw6r" Mar 14 06:08:00 crc kubenswrapper[4713]: I0314 06:08:00.919201 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557808-mfw6r"] Mar 14 06:08:00 crc kubenswrapper[4713]: W0314 06:08:00.920354 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod592b0324_5eb1_47fe_ba55_b8f2e4ca924a.slice/crio-f10816b444c475c9a031580962dbe6d298e925f729c01b9ff05ea2a92b180b8f WatchSource:0}: Error finding container f10816b444c475c9a031580962dbe6d298e925f729c01b9ff05ea2a92b180b8f: Status 404 returned error can't find the container with id f10816b444c475c9a031580962dbe6d298e925f729c01b9ff05ea2a92b180b8f Mar 14 06:08:01 crc kubenswrapper[4713]: I0314 06:08:01.762919 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557808-mfw6r" event={"ID":"592b0324-5eb1-47fe-ba55-b8f2e4ca924a","Type":"ContainerStarted","Data":"f10816b444c475c9a031580962dbe6d298e925f729c01b9ff05ea2a92b180b8f"} Mar 14 06:08:02 crc kubenswrapper[4713]: E0314 06:08:02.661679 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod592b0324_5eb1_47fe_ba55_b8f2e4ca924a.slice/crio-conmon-94a46d1f8713447717f606fbe3485991f7bdc44a979086a93820f84856babd29.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod592b0324_5eb1_47fe_ba55_b8f2e4ca924a.slice/crio-94a46d1f8713447717f606fbe3485991f7bdc44a979086a93820f84856babd29.scope\": RecentStats: unable to find data in memory cache]" Mar 14 06:08:02 crc kubenswrapper[4713]: I0314 
06:08:02.776460 4713 generic.go:334] "Generic (PLEG): container finished" podID="592b0324-5eb1-47fe-ba55-b8f2e4ca924a" containerID="94a46d1f8713447717f606fbe3485991f7bdc44a979086a93820f84856babd29" exitCode=0 Mar 14 06:08:02 crc kubenswrapper[4713]: I0314 06:08:02.776527 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557808-mfw6r" event={"ID":"592b0324-5eb1-47fe-ba55-b8f2e4ca924a","Type":"ContainerDied","Data":"94a46d1f8713447717f606fbe3485991f7bdc44a979086a93820f84856babd29"} Mar 14 06:08:04 crc kubenswrapper[4713]: I0314 06:08:04.247413 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557808-mfw6r" Mar 14 06:08:04 crc kubenswrapper[4713]: I0314 06:08:04.443587 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq8st\" (UniqueName: \"kubernetes.io/projected/592b0324-5eb1-47fe-ba55-b8f2e4ca924a-kube-api-access-gq8st\") pod \"592b0324-5eb1-47fe-ba55-b8f2e4ca924a\" (UID: \"592b0324-5eb1-47fe-ba55-b8f2e4ca924a\") " Mar 14 06:08:04 crc kubenswrapper[4713]: I0314 06:08:04.450489 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/592b0324-5eb1-47fe-ba55-b8f2e4ca924a-kube-api-access-gq8st" (OuterVolumeSpecName: "kube-api-access-gq8st") pod "592b0324-5eb1-47fe-ba55-b8f2e4ca924a" (UID: "592b0324-5eb1-47fe-ba55-b8f2e4ca924a"). InnerVolumeSpecName "kube-api-access-gq8st". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:08:04 crc kubenswrapper[4713]: I0314 06:08:04.547300 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq8st\" (UniqueName: \"kubernetes.io/projected/592b0324-5eb1-47fe-ba55-b8f2e4ca924a-kube-api-access-gq8st\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:04 crc kubenswrapper[4713]: I0314 06:08:04.800852 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557808-mfw6r" event={"ID":"592b0324-5eb1-47fe-ba55-b8f2e4ca924a","Type":"ContainerDied","Data":"f10816b444c475c9a031580962dbe6d298e925f729c01b9ff05ea2a92b180b8f"} Mar 14 06:08:04 crc kubenswrapper[4713]: I0314 06:08:04.801170 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f10816b444c475c9a031580962dbe6d298e925f729c01b9ff05ea2a92b180b8f" Mar 14 06:08:04 crc kubenswrapper[4713]: I0314 06:08:04.800911 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557808-mfw6r" Mar 14 06:08:05 crc kubenswrapper[4713]: I0314 06:08:05.322243 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557802-8lwbs"] Mar 14 06:08:05 crc kubenswrapper[4713]: I0314 06:08:05.335044 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557802-8lwbs"] Mar 14 06:08:05 crc kubenswrapper[4713]: I0314 06:08:05.577364 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8806b4-7007-42b9-b55b-2392aab57894" path="/var/lib/kubelet/pods/2a8806b4-7007-42b9-b55b-2392aab57894/volumes" Mar 14 06:08:08 crc kubenswrapper[4713]: I0314 06:08:08.563968 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:08:08 crc kubenswrapper[4713]: E0314 06:08:08.564506 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:08:19 crc kubenswrapper[4713]: I0314 06:08:19.564133 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:08:19 crc kubenswrapper[4713]: E0314 06:08:19.564961 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:08:23 crc kubenswrapper[4713]: I0314 06:08:23.049690 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-z9m2b"] Mar 14 06:08:23 crc kubenswrapper[4713]: I0314 06:08:23.061100 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-z9m2b"] Mar 14 06:08:23 crc kubenswrapper[4713]: I0314 06:08:23.591517 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4" path="/var/lib/kubelet/pods/e09a1a5f-2ab6-4f6f-ab5e-f608e2e43bb4/volumes" Mar 14 06:08:24 crc kubenswrapper[4713]: I0314 06:08:24.036549 4713 generic.go:334] "Generic (PLEG): container finished" podID="9ff839a7-6783-4526-a69b-66def7b3f8b4" containerID="f36f1b5967ad5075b2ed595de43c66a2f1830806b79b981148cc2f88fb46d26d" exitCode=0 Mar 14 06:08:24 crc kubenswrapper[4713]: I0314 06:08:24.036721 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" event={"ID":"9ff839a7-6783-4526-a69b-66def7b3f8b4","Type":"ContainerDied","Data":"f36f1b5967ad5075b2ed595de43c66a2f1830806b79b981148cc2f88fb46d26d"} Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.525134 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636157 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-repo-setup-combined-ca-bundle\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636492 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-bootstrap-combined-ca-bundle\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636545 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-neutron-metadata-combined-ca-bundle\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636605 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rt96\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-kube-api-access-6rt96\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: 
I0314 06:08:25.636621 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-libvirt-combined-ca-bundle\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636667 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636733 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636811 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-power-monitoring-combined-ca-bundle\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636837 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-combined-ca-bundle\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: 
\"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636866 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636902 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ssh-key-openstack-edpm-ipam\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636943 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.636989 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-inventory\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.637021 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 
14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.637128 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ovn-combined-ca-bundle\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.637173 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-nova-combined-ca-bundle\") pod \"9ff839a7-6783-4526-a69b-66def7b3f8b4\" (UID: \"9ff839a7-6783-4526-a69b-66def7b3f8b4\") " Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.643493 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.643789 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.644217 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-kube-api-access-6rt96" (OuterVolumeSpecName: "kube-api-access-6rt96") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "kube-api-access-6rt96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.644356 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.648017 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.648114 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.648191 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.648249 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.648258 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.649853 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.650350 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.650376 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.652126 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.656464 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.674786 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-inventory" (OuterVolumeSpecName: "inventory") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.682260 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ff839a7-6783-4526-a69b-66def7b3f8b4" (UID: "9ff839a7-6783-4526-a69b-66def7b3f8b4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740105 4713 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740146 4713 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740159 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rt96\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-kube-api-access-6rt96\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740167 4713 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740178 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740188 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740200 4713 
reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740220 4713 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740230 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740239 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740248 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740256 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740264 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9ff839a7-6783-4526-a69b-66def7b3f8b4-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740273 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740283 4713 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:25 crc kubenswrapper[4713]: I0314 06:08:25.740291 4713 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff839a7-6783-4526-a69b-66def7b3f8b4-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.064107 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" event={"ID":"9ff839a7-6783-4526-a69b-66def7b3f8b4","Type":"ContainerDied","Data":"9fc08c536914a08e715ce30fde970a71fdc61b689cf5806d5f834b95f0b4a586"} Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.064152 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fc08c536914a08e715ce30fde970a71fdc61b689cf5806d5f834b95f0b4a586" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.064237 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fflms" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.160289 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8"] Mar 14 06:08:26 crc kubenswrapper[4713]: E0314 06:08:26.160763 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="592b0324-5eb1-47fe-ba55-b8f2e4ca924a" containerName="oc" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.160781 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="592b0324-5eb1-47fe-ba55-b8f2e4ca924a" containerName="oc" Mar 14 06:08:26 crc kubenswrapper[4713]: E0314 06:08:26.160796 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff839a7-6783-4526-a69b-66def7b3f8b4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.160805 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff839a7-6783-4526-a69b-66def7b3f8b4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.161028 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="592b0324-5eb1-47fe-ba55-b8f2e4ca924a" containerName="oc" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.161046 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff839a7-6783-4526-a69b-66def7b3f8b4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.161812 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.164796 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.165453 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.165520 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.165573 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.165573 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.181253 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8"] Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.252179 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.252418 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: 
\"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.252632 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fp86\" (UniqueName: \"kubernetes.io/projected/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-kube-api-access-8fp86\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.252722 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.252934 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.355693 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.355930 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.356004 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fp86\" (UniqueName: \"kubernetes.io/projected/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-kube-api-access-8fp86\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.356049 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.356104 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.356792 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: 
\"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.359607 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.359817 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.360083 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.372829 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fp86\" (UniqueName: \"kubernetes.io/projected/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-kube-api-access-8fp86\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbkx8\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:26 crc kubenswrapper[4713]: I0314 06:08:26.478800 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:08:27 crc kubenswrapper[4713]: I0314 06:08:27.028545 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8"] Mar 14 06:08:27 crc kubenswrapper[4713]: I0314 06:08:27.078371 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" event={"ID":"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38","Type":"ContainerStarted","Data":"f09f538da9a0106a0764cc06b30b8d05103f15dc7e85a9493c6181c220e40ca8"} Mar 14 06:08:28 crc kubenswrapper[4713]: I0314 06:08:28.090754 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" event={"ID":"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38","Type":"ContainerStarted","Data":"50f30662b3431d2b0c95bdc60fdd28fbddf919932d0b181c1728b4e289f46846"} Mar 14 06:08:28 crc kubenswrapper[4713]: I0314 06:08:28.111606 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" podStartSLOduration=1.699344622 podStartE2EDuration="2.111589746s" podCreationTimestamp="2026-03-14 06:08:26 +0000 UTC" firstStartedPulling="2026-03-14 06:08:27.036721792 +0000 UTC m=+2490.124631112" lastFinishedPulling="2026-03-14 06:08:27.448966936 +0000 UTC m=+2490.536876236" observedRunningTime="2026-03-14 06:08:28.109881402 +0000 UTC m=+2491.197790712" watchObservedRunningTime="2026-03-14 06:08:28.111589746 +0000 UTC m=+2491.199499046" Mar 14 06:08:31 crc kubenswrapper[4713]: I0314 06:08:31.564844 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:08:31 crc kubenswrapper[4713]: E0314 06:08:31.565740 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:08:43 crc kubenswrapper[4713]: I0314 06:08:43.563981 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:08:43 crc kubenswrapper[4713]: E0314 06:08:43.564807 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:08:50 crc kubenswrapper[4713]: I0314 06:08:50.387871 4713 scope.go:117] "RemoveContainer" containerID="e5fba5c615ba9c66775f99bcbb12687420fb521a0f71bd92b08f265b9ff80244" Mar 14 06:08:50 crc kubenswrapper[4713]: I0314 06:08:50.434912 4713 scope.go:117] "RemoveContainer" containerID="49c4eb7f48cc08811555d4532c834ba0d19a5c8de9c78ada217a1c074a393430" Mar 14 06:08:56 crc kubenswrapper[4713]: I0314 06:08:56.564637 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:08:56 crc kubenswrapper[4713]: E0314 06:08:56.565467 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:09:09 crc kubenswrapper[4713]: 
I0314 06:09:09.563479 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:09:09 crc kubenswrapper[4713]: E0314 06:09:09.564123 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:09:22 crc kubenswrapper[4713]: I0314 06:09:22.563275 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:09:22 crc kubenswrapper[4713]: E0314 06:09:22.564064 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:09:30 crc kubenswrapper[4713]: I0314 06:09:30.772362 4713 generic.go:334] "Generic (PLEG): container finished" podID="93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38" containerID="50f30662b3431d2b0c95bdc60fdd28fbddf919932d0b181c1728b4e289f46846" exitCode=0 Mar 14 06:09:30 crc kubenswrapper[4713]: I0314 06:09:30.772508 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" event={"ID":"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38","Type":"ContainerDied","Data":"50f30662b3431d2b0c95bdc60fdd28fbddf919932d0b181c1728b4e289f46846"} Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.274951 4713 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.328718 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ssh-key-openstack-edpm-ipam\") pod \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.328892 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovncontroller-config-0\") pod \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.329054 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-inventory\") pod \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.329178 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fp86\" (UniqueName: \"kubernetes.io/projected/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-kube-api-access-8fp86\") pod \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.329234 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovn-combined-ca-bundle\") pod \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\" (UID: \"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38\") " Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.336143 4713 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-kube-api-access-8fp86" (OuterVolumeSpecName: "kube-api-access-8fp86") pod "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38" (UID: "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38"). InnerVolumeSpecName "kube-api-access-8fp86". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.336642 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38" (UID: "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.378709 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-inventory" (OuterVolumeSpecName: "inventory") pod "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38" (UID: "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.386220 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38" (UID: "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.392786 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38" (UID: "93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.432969 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fp86\" (UniqueName: \"kubernetes.io/projected/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-kube-api-access-8fp86\") on node \"crc\" DevicePath \"\"" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.433356 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.433451 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.433534 4713 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.433600 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.800320 4713 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" event={"ID":"93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38","Type":"ContainerDied","Data":"f09f538da9a0106a0764cc06b30b8d05103f15dc7e85a9493c6181c220e40ca8"} Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.800368 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09f538da9a0106a0764cc06b30b8d05103f15dc7e85a9493c6181c220e40ca8" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.800489 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbkx8" Mar 14 06:09:32 crc kubenswrapper[4713]: E0314 06:09:32.922865 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93bee75a_f41d_4e0b_8e3e_3a2c8eb63c38.slice/crio-f09f538da9a0106a0764cc06b30b8d05103f15dc7e85a9493c6181c220e40ca8\": RecentStats: unable to find data in memory cache]" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.937955 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"] Mar 14 06:09:32 crc kubenswrapper[4713]: E0314 06:09:32.938807 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.938897 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.939243 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 
06:09:32.940168 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.942928 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.943106 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6"
Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.943282 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.943422 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.943533 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.943705 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 14 06:09:32 crc kubenswrapper[4713]: I0314 06:09:32.959259 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"]
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.049898 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.049963 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.050000 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.050114 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.050166 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.050197 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6vb\" (UniqueName: \"kubernetes.io/projected/e89967bf-adf8-4756-9097-75e19857a93c-kube-api-access-rr6vb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.152849 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.153281 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.153331 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6vb\" (UniqueName: \"kubernetes.io/projected/e89967bf-adf8-4756-9097-75e19857a93c-kube-api-access-rr6vb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.153546 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.153602 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.153687 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.157969 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.158271 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.158414 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.162001 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.168742 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.171888 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6vb\" (UniqueName: \"kubernetes.io/projected/e89967bf-adf8-4756-9097-75e19857a93c-kube-api-access-rr6vb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.265609 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:09:33 crc kubenswrapper[4713]: I0314 06:09:33.888711 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"]
Mar 14 06:09:34 crc kubenswrapper[4713]: I0314 06:09:34.825740 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz" event={"ID":"e89967bf-adf8-4756-9097-75e19857a93c","Type":"ContainerStarted","Data":"9ea928a084856d3d0590fa9ee278898e0fba21cc5bc7e55728a69b08be56c1b8"}
Mar 14 06:09:35 crc kubenswrapper[4713]: I0314 06:09:35.564681 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"
Mar 14 06:09:35 crc kubenswrapper[4713]: E0314 06:09:35.565452 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:09:36 crc kubenswrapper[4713]: I0314 06:09:36.849393 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz" event={"ID":"e89967bf-adf8-4756-9097-75e19857a93c","Type":"ContainerStarted","Data":"9fc5673476f25494bee01baa0d6c15212754205700abf84a4aa31ca359daa9bd"}
Mar 14 06:09:36 crc kubenswrapper[4713]: I0314 06:09:36.887775 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz" podStartSLOduration=3.285505294 podStartE2EDuration="4.887755368s" podCreationTimestamp="2026-03-14 06:09:32 +0000 UTC" firstStartedPulling="2026-03-14 06:09:33.913336067 +0000 UTC m=+2557.001245367" lastFinishedPulling="2026-03-14 06:09:35.515586141 +0000 UTC m=+2558.603495441" observedRunningTime="2026-03-14 06:09:36.873768908 +0000 UTC m=+2559.961678228" watchObservedRunningTime="2026-03-14 06:09:36.887755368 +0000 UTC m=+2559.975664668"
Mar 14 06:09:48 crc kubenswrapper[4713]: I0314 06:09:48.564361 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"
Mar 14 06:09:48 crc kubenswrapper[4713]: E0314 06:09:48.565051 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.179719 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557810-xpw56"]
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.182189 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557810-xpw56"
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.185680 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.189471 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.189614 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.196549 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557810-xpw56"]
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.293929 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n57jm\" (UniqueName: \"kubernetes.io/projected/dae550f4-6433-41cd-a287-1e2e6ee25c55-kube-api-access-n57jm\") pod \"auto-csr-approver-29557810-xpw56\" (UID: \"dae550f4-6433-41cd-a287-1e2e6ee25c55\") " pod="openshift-infra/auto-csr-approver-29557810-xpw56"
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.396404 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n57jm\" (UniqueName: \"kubernetes.io/projected/dae550f4-6433-41cd-a287-1e2e6ee25c55-kube-api-access-n57jm\") pod \"auto-csr-approver-29557810-xpw56\" (UID: \"dae550f4-6433-41cd-a287-1e2e6ee25c55\") " pod="openshift-infra/auto-csr-approver-29557810-xpw56"
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.425554 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n57jm\" (UniqueName: \"kubernetes.io/projected/dae550f4-6433-41cd-a287-1e2e6ee25c55-kube-api-access-n57jm\") pod \"auto-csr-approver-29557810-xpw56\" (UID: \"dae550f4-6433-41cd-a287-1e2e6ee25c55\") " pod="openshift-infra/auto-csr-approver-29557810-xpw56"
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.538651 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557810-xpw56"
Mar 14 06:10:00 crc kubenswrapper[4713]: I0314 06:10:00.994714 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557810-xpw56"]
Mar 14 06:10:01 crc kubenswrapper[4713]: I0314 06:10:01.180228 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557810-xpw56" event={"ID":"dae550f4-6433-41cd-a287-1e2e6ee25c55","Type":"ContainerStarted","Data":"52afafaa57410e4e2da27732d16068d110d632dc07ab717840dd52bdbbe7f6a1"}
Mar 14 06:10:02 crc kubenswrapper[4713]: I0314 06:10:02.563817 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"
Mar 14 06:10:02 crc kubenswrapper[4713]: E0314 06:10:02.564426 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:10:03 crc kubenswrapper[4713]: I0314 06:10:03.203629 4713 generic.go:334] "Generic (PLEG): container finished" podID="dae550f4-6433-41cd-a287-1e2e6ee25c55" containerID="5ef792da3e4f3d01adec9b047aa6808e639fdd64b16f60f81406ffcc3ea78aac" exitCode=0
Mar 14 06:10:03 crc kubenswrapper[4713]: I0314 06:10:03.203689 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557810-xpw56" event={"ID":"dae550f4-6433-41cd-a287-1e2e6ee25c55","Type":"ContainerDied","Data":"5ef792da3e4f3d01adec9b047aa6808e639fdd64b16f60f81406ffcc3ea78aac"}
Mar 14 06:10:04 crc kubenswrapper[4713]: I0314 06:10:04.611744 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557810-xpw56"
Mar 14 06:10:04 crc kubenswrapper[4713]: I0314 06:10:04.704997 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n57jm\" (UniqueName: \"kubernetes.io/projected/dae550f4-6433-41cd-a287-1e2e6ee25c55-kube-api-access-n57jm\") pod \"dae550f4-6433-41cd-a287-1e2e6ee25c55\" (UID: \"dae550f4-6433-41cd-a287-1e2e6ee25c55\") "
Mar 14 06:10:04 crc kubenswrapper[4713]: I0314 06:10:04.711432 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae550f4-6433-41cd-a287-1e2e6ee25c55-kube-api-access-n57jm" (OuterVolumeSpecName: "kube-api-access-n57jm") pod "dae550f4-6433-41cd-a287-1e2e6ee25c55" (UID: "dae550f4-6433-41cd-a287-1e2e6ee25c55"). InnerVolumeSpecName "kube-api-access-n57jm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:10:04 crc kubenswrapper[4713]: I0314 06:10:04.808714 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n57jm\" (UniqueName: \"kubernetes.io/projected/dae550f4-6433-41cd-a287-1e2e6ee25c55-kube-api-access-n57jm\") on node \"crc\" DevicePath \"\""
Mar 14 06:10:05 crc kubenswrapper[4713]: I0314 06:10:05.226541 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557810-xpw56" event={"ID":"dae550f4-6433-41cd-a287-1e2e6ee25c55","Type":"ContainerDied","Data":"52afafaa57410e4e2da27732d16068d110d632dc07ab717840dd52bdbbe7f6a1"}
Mar 14 06:10:05 crc kubenswrapper[4713]: I0314 06:10:05.226592 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52afafaa57410e4e2da27732d16068d110d632dc07ab717840dd52bdbbe7f6a1"
Mar 14 06:10:05 crc kubenswrapper[4713]: I0314 06:10:05.226594 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557810-xpw56"
Mar 14 06:10:05 crc kubenswrapper[4713]: I0314 06:10:05.683534 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557804-42lnq"]
Mar 14 06:10:05 crc kubenswrapper[4713]: I0314 06:10:05.693602 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557804-42lnq"]
Mar 14 06:10:07 crc kubenswrapper[4713]: I0314 06:10:07.580255 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21da1e63-cb2b-4604-8d9c-614007e64c5e" path="/var/lib/kubelet/pods/21da1e63-cb2b-4604-8d9c-614007e64c5e/volumes"
Mar 14 06:10:17 crc kubenswrapper[4713]: I0314 06:10:17.574235 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"
Mar 14 06:10:17 crc kubenswrapper[4713]: E0314 06:10:17.575217 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:10:22 crc kubenswrapper[4713]: I0314 06:10:22.404640 4713 generic.go:334] "Generic (PLEG): container finished" podID="e89967bf-adf8-4756-9097-75e19857a93c" containerID="9fc5673476f25494bee01baa0d6c15212754205700abf84a4aa31ca359daa9bd" exitCode=0
Mar 14 06:10:22 crc kubenswrapper[4713]: I0314 06:10:22.404823 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz" event={"ID":"e89967bf-adf8-4756-9097-75e19857a93c","Type":"ContainerDied","Data":"9fc5673476f25494bee01baa0d6c15212754205700abf84a4aa31ca359daa9bd"}
Mar 14 06:10:23 crc kubenswrapper[4713]: I0314 06:10:23.903026 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.064287 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-nova-metadata-neutron-config-0\") pod \"e89967bf-adf8-4756-9097-75e19857a93c\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") "
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.064477 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-metadata-combined-ca-bundle\") pod \"e89967bf-adf8-4756-9097-75e19857a93c\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") "
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.064550 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-inventory\") pod \"e89967bf-adf8-4756-9097-75e19857a93c\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") "
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.064611 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr6vb\" (UniqueName: \"kubernetes.io/projected/e89967bf-adf8-4756-9097-75e19857a93c-kube-api-access-rr6vb\") pod \"e89967bf-adf8-4756-9097-75e19857a93c\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") "
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.064683 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-ssh-key-openstack-edpm-ipam\") pod \"e89967bf-adf8-4756-9097-75e19857a93c\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") "
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.064810 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e89967bf-adf8-4756-9097-75e19857a93c\" (UID: \"e89967bf-adf8-4756-9097-75e19857a93c\") "
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.070857 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89967bf-adf8-4756-9097-75e19857a93c-kube-api-access-rr6vb" (OuterVolumeSpecName: "kube-api-access-rr6vb") pod "e89967bf-adf8-4756-9097-75e19857a93c" (UID: "e89967bf-adf8-4756-9097-75e19857a93c"). InnerVolumeSpecName "kube-api-access-rr6vb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.071393 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e89967bf-adf8-4756-9097-75e19857a93c" (UID: "e89967bf-adf8-4756-9097-75e19857a93c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.102336 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e89967bf-adf8-4756-9097-75e19857a93c" (UID: "e89967bf-adf8-4756-9097-75e19857a93c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.104878 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e89967bf-adf8-4756-9097-75e19857a93c" (UID: "e89967bf-adf8-4756-9097-75e19857a93c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.107311 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e89967bf-adf8-4756-9097-75e19857a93c" (UID: "e89967bf-adf8-4756-9097-75e19857a93c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.116432 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-inventory" (OuterVolumeSpecName: "inventory") pod "e89967bf-adf8-4756-9097-75e19857a93c" (UID: "e89967bf-adf8-4756-9097-75e19857a93c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.166834 4713 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.167065 4713 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.167135 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-inventory\") on node \"crc\" DevicePath \"\""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.167196 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr6vb\" (UniqueName: \"kubernetes.io/projected/e89967bf-adf8-4756-9097-75e19857a93c-kube-api-access-rr6vb\") on node \"crc\" DevicePath \"\""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.167268 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.167322 4713 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e89967bf-adf8-4756-9097-75e19857a93c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.425949 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz" event={"ID":"e89967bf-adf8-4756-9097-75e19857a93c","Type":"ContainerDied","Data":"9ea928a084856d3d0590fa9ee278898e0fba21cc5bc7e55728a69b08be56c1b8"}
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.426012 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.426038 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea928a084856d3d0590fa9ee278898e0fba21cc5bc7e55728a69b08be56c1b8"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.522925 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"]
Mar 14 06:10:24 crc kubenswrapper[4713]: E0314 06:10:24.523563 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae550f4-6433-41cd-a287-1e2e6ee25c55" containerName="oc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.523589 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae550f4-6433-41cd-a287-1e2e6ee25c55" containerName="oc"
Mar 14 06:10:24 crc kubenswrapper[4713]: E0314 06:10:24.523638 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89967bf-adf8-4756-9097-75e19857a93c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.523649 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89967bf-adf8-4756-9097-75e19857a93c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.523928 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89967bf-adf8-4756-9097-75e19857a93c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.523956 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae550f4-6433-41cd-a287-1e2e6ee25c55" containerName="oc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.524961 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.535177 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.535297 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.535178 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.536492 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.541934 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.547969 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"]
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.577464 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.577608 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.577691 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.577810 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4twh7\" (UniqueName: \"kubernetes.io/projected/068a337b-3e10-4cdf-9883-6a9311bb4424-kube-api-access-4twh7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.577850 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.680537 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4twh7\" (UniqueName: \"kubernetes.io/projected/068a337b-3e10-4cdf-9883-6a9311bb4424-kube-api-access-4twh7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.680591 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.680702 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.681437 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.681506 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.684724 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.685127 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.685670 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.685764 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.699915 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4twh7\" (UniqueName: \"kubernetes.io/projected/068a337b-3e10-4cdf-9883-6a9311bb4424-kube-api-access-4twh7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:24 crc kubenswrapper[4713]: I0314 06:10:24.865920 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:10:25 crc kubenswrapper[4713]: I0314 06:10:25.471839 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"]
Mar 14 06:10:25 crc kubenswrapper[4713]: W0314 06:10:25.473010 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod068a337b_3e10_4cdf_9883_6a9311bb4424.slice/crio-c271c59b477bb8e483c9f1fe87c17c783a0131a41a2a1044b79f3bebc54b0d7e WatchSource:0}: Error finding container c271c59b477bb8e483c9f1fe87c17c783a0131a41a2a1044b79f3bebc54b0d7e: Status 404 returned error can't find the container with id c271c59b477bb8e483c9f1fe87c17c783a0131a41a2a1044b79f3bebc54b0d7e
Mar 14 06:10:25 crc kubenswrapper[4713]: I0314 06:10:25.856659 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2tqlr"]
Mar 14 06:10:25 crc kubenswrapper[4713]: I0314 06:10:25.860898 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:25 crc kubenswrapper[4713]: I0314 06:10:25.870239 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tqlr"] Mar 14 06:10:25 crc kubenswrapper[4713]: I0314 06:10:25.951943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gvds\" (UniqueName: \"kubernetes.io/projected/58cc3c73-af6f-45cf-a041-b54b962a0b4f-kube-api-access-6gvds\") pod \"redhat-marketplace-2tqlr\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:25 crc kubenswrapper[4713]: I0314 06:10:25.952077 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-utilities\") pod \"redhat-marketplace-2tqlr\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:25 crc kubenswrapper[4713]: I0314 06:10:25.952737 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-catalog-content\") pod \"redhat-marketplace-2tqlr\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:26 crc kubenswrapper[4713]: I0314 06:10:26.054805 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-catalog-content\") pod \"redhat-marketplace-2tqlr\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:26 crc kubenswrapper[4713]: I0314 06:10:26.054898 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6gvds\" (UniqueName: \"kubernetes.io/projected/58cc3c73-af6f-45cf-a041-b54b962a0b4f-kube-api-access-6gvds\") pod \"redhat-marketplace-2tqlr\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:26 crc kubenswrapper[4713]: I0314 06:10:26.054990 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-utilities\") pod \"redhat-marketplace-2tqlr\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:26 crc kubenswrapper[4713]: I0314 06:10:26.055407 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-catalog-content\") pod \"redhat-marketplace-2tqlr\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:26 crc kubenswrapper[4713]: I0314 06:10:26.055493 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-utilities\") pod \"redhat-marketplace-2tqlr\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:26 crc kubenswrapper[4713]: I0314 06:10:26.077605 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gvds\" (UniqueName: \"kubernetes.io/projected/58cc3c73-af6f-45cf-a041-b54b962a0b4f-kube-api-access-6gvds\") pod \"redhat-marketplace-2tqlr\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:26 crc kubenswrapper[4713]: I0314 06:10:26.194039 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:26 crc kubenswrapper[4713]: I0314 06:10:26.459349 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc" event={"ID":"068a337b-3e10-4cdf-9883-6a9311bb4424","Type":"ContainerStarted","Data":"c271c59b477bb8e483c9f1fe87c17c783a0131a41a2a1044b79f3bebc54b0d7e"} Mar 14 06:10:26 crc kubenswrapper[4713]: I0314 06:10:26.743597 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tqlr"] Mar 14 06:10:27 crc kubenswrapper[4713]: I0314 06:10:27.473588 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc" event={"ID":"068a337b-3e10-4cdf-9883-6a9311bb4424","Type":"ContainerStarted","Data":"6fcae0f7c7f9b0a39fb56a17425cef739aca56c9d0a6a40ea04cd43128d249af"} Mar 14 06:10:27 crc kubenswrapper[4713]: I0314 06:10:27.477871 4713 generic.go:334] "Generic (PLEG): container finished" podID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerID="c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9" exitCode=0 Mar 14 06:10:27 crc kubenswrapper[4713]: I0314 06:10:27.477919 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tqlr" event={"ID":"58cc3c73-af6f-45cf-a041-b54b962a0b4f","Type":"ContainerDied","Data":"c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9"} Mar 14 06:10:27 crc kubenswrapper[4713]: I0314 06:10:27.477944 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tqlr" event={"ID":"58cc3c73-af6f-45cf-a041-b54b962a0b4f","Type":"ContainerStarted","Data":"1d36af8343e90866708cb8bd7ee8d5c705c360e23b3d060cfa96cd818d0a79ee"} Mar 14 06:10:27 crc kubenswrapper[4713]: I0314 06:10:27.505699 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc" podStartSLOduration=2.8606283980000002 podStartE2EDuration="3.505679415s" podCreationTimestamp="2026-03-14 06:10:24 +0000 UTC" firstStartedPulling="2026-03-14 06:10:25.475552465 +0000 UTC m=+2608.563461765" lastFinishedPulling="2026-03-14 06:10:26.120603482 +0000 UTC m=+2609.208512782" observedRunningTime="2026-03-14 06:10:27.49185286 +0000 UTC m=+2610.579762150" watchObservedRunningTime="2026-03-14 06:10:27.505679415 +0000 UTC m=+2610.593588715" Mar 14 06:10:28 crc kubenswrapper[4713]: I0314 06:10:28.490586 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tqlr" event={"ID":"58cc3c73-af6f-45cf-a041-b54b962a0b4f","Type":"ContainerStarted","Data":"78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132"} Mar 14 06:10:28 crc kubenswrapper[4713]: I0314 06:10:28.564376 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:10:28 crc kubenswrapper[4713]: E0314 06:10:28.564670 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:10:29 crc kubenswrapper[4713]: I0314 06:10:29.504279 4713 generic.go:334] "Generic (PLEG): container finished" podID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerID="78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132" exitCode=0 Mar 14 06:10:29 crc kubenswrapper[4713]: I0314 06:10:29.504347 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tqlr" 
event={"ID":"58cc3c73-af6f-45cf-a041-b54b962a0b4f","Type":"ContainerDied","Data":"78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132"} Mar 14 06:10:30 crc kubenswrapper[4713]: I0314 06:10:30.519086 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tqlr" event={"ID":"58cc3c73-af6f-45cf-a041-b54b962a0b4f","Type":"ContainerStarted","Data":"b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab"} Mar 14 06:10:30 crc kubenswrapper[4713]: I0314 06:10:30.540037 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2tqlr" podStartSLOduration=3.08752333 podStartE2EDuration="5.540009263s" podCreationTimestamp="2026-03-14 06:10:25 +0000 UTC" firstStartedPulling="2026-03-14 06:10:27.480091889 +0000 UTC m=+2610.568001189" lastFinishedPulling="2026-03-14 06:10:29.932577802 +0000 UTC m=+2613.020487122" observedRunningTime="2026-03-14 06:10:30.537284137 +0000 UTC m=+2613.625193457" watchObservedRunningTime="2026-03-14 06:10:30.540009263 +0000 UTC m=+2613.627918593" Mar 14 06:10:36 crc kubenswrapper[4713]: I0314 06:10:36.194649 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:36 crc kubenswrapper[4713]: I0314 06:10:36.195171 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:36 crc kubenswrapper[4713]: I0314 06:10:36.242855 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:36 crc kubenswrapper[4713]: I0314 06:10:36.661538 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:36 crc kubenswrapper[4713]: I0314 06:10:36.751292 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-2tqlr"] Mar 14 06:10:38 crc kubenswrapper[4713]: I0314 06:10:38.601899 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2tqlr" podUID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerName="registry-server" containerID="cri-o://b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab" gracePeriod=2 Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.140493 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.274077 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-catalog-content\") pod \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.274291 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-utilities\") pod \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.274693 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gvds\" (UniqueName: \"kubernetes.io/projected/58cc3c73-af6f-45cf-a041-b54b962a0b4f-kube-api-access-6gvds\") pod \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\" (UID: \"58cc3c73-af6f-45cf-a041-b54b962a0b4f\") " Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.275058 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-utilities" (OuterVolumeSpecName: "utilities") pod "58cc3c73-af6f-45cf-a041-b54b962a0b4f" (UID: 
"58cc3c73-af6f-45cf-a041-b54b962a0b4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.275537 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.297173 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58cc3c73-af6f-45cf-a041-b54b962a0b4f-kube-api-access-6gvds" (OuterVolumeSpecName: "kube-api-access-6gvds") pod "58cc3c73-af6f-45cf-a041-b54b962a0b4f" (UID: "58cc3c73-af6f-45cf-a041-b54b962a0b4f"). InnerVolumeSpecName "kube-api-access-6gvds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.298488 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58cc3c73-af6f-45cf-a041-b54b962a0b4f" (UID: "58cc3c73-af6f-45cf-a041-b54b962a0b4f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.378041 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gvds\" (UniqueName: \"kubernetes.io/projected/58cc3c73-af6f-45cf-a041-b54b962a0b4f-kube-api-access-6gvds\") on node \"crc\" DevicePath \"\"" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.378075 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc3c73-af6f-45cf-a041-b54b962a0b4f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.613068 4713 generic.go:334] "Generic (PLEG): container finished" podID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerID="b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab" exitCode=0 Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.613109 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tqlr" event={"ID":"58cc3c73-af6f-45cf-a041-b54b962a0b4f","Type":"ContainerDied","Data":"b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab"} Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.613135 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tqlr" event={"ID":"58cc3c73-af6f-45cf-a041-b54b962a0b4f","Type":"ContainerDied","Data":"1d36af8343e90866708cb8bd7ee8d5c705c360e23b3d060cfa96cd818d0a79ee"} Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.613151 4713 scope.go:117] "RemoveContainer" containerID="b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.613285 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tqlr" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.638145 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tqlr"] Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.648561 4713 scope.go:117] "RemoveContainer" containerID="78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.658188 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tqlr"] Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.667554 4713 scope.go:117] "RemoveContainer" containerID="c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.720807 4713 scope.go:117] "RemoveContainer" containerID="b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab" Mar 14 06:10:39 crc kubenswrapper[4713]: E0314 06:10:39.721141 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab\": container with ID starting with b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab not found: ID does not exist" containerID="b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.721179 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab"} err="failed to get container status \"b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab\": rpc error: code = NotFound desc = could not find container \"b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab\": container with ID starting with b31dd89fef3284b4b2d71f17983af0e7ee395f6777a3151bc22381e6bce545ab not found: 
ID does not exist" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.721217 4713 scope.go:117] "RemoveContainer" containerID="78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132" Mar 14 06:10:39 crc kubenswrapper[4713]: E0314 06:10:39.721505 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132\": container with ID starting with 78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132 not found: ID does not exist" containerID="78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.721542 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132"} err="failed to get container status \"78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132\": rpc error: code = NotFound desc = could not find container \"78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132\": container with ID starting with 78fc48e25a7f4a43364256c18375975d52f2ff70f1ea1c6ed9aa461e488c3132 not found: ID does not exist" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.721561 4713 scope.go:117] "RemoveContainer" containerID="c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9" Mar 14 06:10:39 crc kubenswrapper[4713]: E0314 06:10:39.721815 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9\": container with ID starting with c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9 not found: ID does not exist" containerID="c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9" Mar 14 06:10:39 crc kubenswrapper[4713]: I0314 06:10:39.721841 4713 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9"} err="failed to get container status \"c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9\": rpc error: code = NotFound desc = could not find container \"c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9\": container with ID starting with c9c5ef1ce46681c38f1db4be8e949d89aff452d2da559eb13d5b49317ef87eb9 not found: ID does not exist" Mar 14 06:10:41 crc kubenswrapper[4713]: I0314 06:10:41.564470 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:10:41 crc kubenswrapper[4713]: E0314 06:10:41.565279 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:10:41 crc kubenswrapper[4713]: I0314 06:10:41.579636 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" path="/var/lib/kubelet/pods/58cc3c73-af6f-45cf-a041-b54b962a0b4f/volumes" Mar 14 06:10:50 crc kubenswrapper[4713]: I0314 06:10:50.576285 4713 scope.go:117] "RemoveContainer" containerID="b272a9bb621b7373c94ad09998c664d94abbe89e2ff50ee6e650d3afbf514303" Mar 14 06:10:55 crc kubenswrapper[4713]: I0314 06:10:55.565006 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:10:55 crc kubenswrapper[4713]: E0314 06:10:55.565921 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:11:07 crc kubenswrapper[4713]: I0314 06:11:07.572226 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:11:07 crc kubenswrapper[4713]: E0314 06:11:07.573089 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:11:19 crc kubenswrapper[4713]: I0314 06:11:19.564522 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:11:19 crc kubenswrapper[4713]: E0314 06:11:19.565296 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:11:34 crc kubenswrapper[4713]: I0314 06:11:34.565379 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:11:34 crc kubenswrapper[4713]: E0314 06:11:34.566228 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:11:47 crc kubenswrapper[4713]: I0314 06:11:47.574479 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:11:47 crc kubenswrapper[4713]: E0314 06:11:47.575440 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:11:59 crc kubenswrapper[4713]: I0314 06:11:59.565318 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df" Mar 14 06:11:59 crc kubenswrapper[4713]: E0314 06:11:59.566453 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.157493 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557812-2bxlg"] Mar 14 06:12:00 crc kubenswrapper[4713]: E0314 06:12:00.158158 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" 
containerName="registry-server" Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.158182 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerName="registry-server" Mar 14 06:12:00 crc kubenswrapper[4713]: E0314 06:12:00.158267 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerName="extract-utilities" Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.158279 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerName="extract-utilities" Mar 14 06:12:00 crc kubenswrapper[4713]: E0314 06:12:00.158296 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerName="extract-content" Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.158306 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerName="extract-content" Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.158606 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="58cc3c73-af6f-45cf-a041-b54b962a0b4f" containerName="registry-server" Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.159657 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557812-2bxlg"
Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.163375 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.164085 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.164353 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.202761 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557812-2bxlg"]
Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.339498 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldfzv\" (UniqueName: \"kubernetes.io/projected/7a261a9d-d943-4ef6-a98d-5674ac862f1b-kube-api-access-ldfzv\") pod \"auto-csr-approver-29557812-2bxlg\" (UID: \"7a261a9d-d943-4ef6-a98d-5674ac862f1b\") " pod="openshift-infra/auto-csr-approver-29557812-2bxlg"
Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.441963 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldfzv\" (UniqueName: \"kubernetes.io/projected/7a261a9d-d943-4ef6-a98d-5674ac862f1b-kube-api-access-ldfzv\") pod \"auto-csr-approver-29557812-2bxlg\" (UID: \"7a261a9d-d943-4ef6-a98d-5674ac862f1b\") " pod="openshift-infra/auto-csr-approver-29557812-2bxlg"
Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.471370 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldfzv\" (UniqueName: \"kubernetes.io/projected/7a261a9d-d943-4ef6-a98d-5674ac862f1b-kube-api-access-ldfzv\") pod \"auto-csr-approver-29557812-2bxlg\" (UID: \"7a261a9d-d943-4ef6-a98d-5674ac862f1b\") " pod="openshift-infra/auto-csr-approver-29557812-2bxlg"
Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.485934 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557812-2bxlg"
Mar 14 06:12:00 crc kubenswrapper[4713]: I0314 06:12:00.801951 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557812-2bxlg"]
Mar 14 06:12:01 crc kubenswrapper[4713]: I0314 06:12:01.312756 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557812-2bxlg" event={"ID":"7a261a9d-d943-4ef6-a98d-5674ac862f1b","Type":"ContainerStarted","Data":"f6f410e5eed60ef63404a45aa646485229ea4291dde9b97de19f2d560f1620f8"}
Mar 14 06:12:02 crc kubenswrapper[4713]: I0314 06:12:02.326141 4713 generic.go:334] "Generic (PLEG): container finished" podID="7a261a9d-d943-4ef6-a98d-5674ac862f1b" containerID="fb49f780a07db2887ed615d84ecde34d3e2fc213ccd783a57681949b70638d09" exitCode=0
Mar 14 06:12:02 crc kubenswrapper[4713]: I0314 06:12:02.326245 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557812-2bxlg" event={"ID":"7a261a9d-d943-4ef6-a98d-5674ac862f1b","Type":"ContainerDied","Data":"fb49f780a07db2887ed615d84ecde34d3e2fc213ccd783a57681949b70638d09"}
Mar 14 06:12:03 crc kubenswrapper[4713]: I0314 06:12:03.743098 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557812-2bxlg"
Mar 14 06:12:03 crc kubenswrapper[4713]: I0314 06:12:03.938409 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldfzv\" (UniqueName: \"kubernetes.io/projected/7a261a9d-d943-4ef6-a98d-5674ac862f1b-kube-api-access-ldfzv\") pod \"7a261a9d-d943-4ef6-a98d-5674ac862f1b\" (UID: \"7a261a9d-d943-4ef6-a98d-5674ac862f1b\") "
Mar 14 06:12:03 crc kubenswrapper[4713]: I0314 06:12:03.975087 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a261a9d-d943-4ef6-a98d-5674ac862f1b-kube-api-access-ldfzv" (OuterVolumeSpecName: "kube-api-access-ldfzv") pod "7a261a9d-d943-4ef6-a98d-5674ac862f1b" (UID: "7a261a9d-d943-4ef6-a98d-5674ac862f1b"). InnerVolumeSpecName "kube-api-access-ldfzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:12:04 crc kubenswrapper[4713]: I0314 06:12:04.050693 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldfzv\" (UniqueName: \"kubernetes.io/projected/7a261a9d-d943-4ef6-a98d-5674ac862f1b-kube-api-access-ldfzv\") on node \"crc\" DevicePath \"\""
Mar 14 06:12:04 crc kubenswrapper[4713]: I0314 06:12:04.351934 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557812-2bxlg" event={"ID":"7a261a9d-d943-4ef6-a98d-5674ac862f1b","Type":"ContainerDied","Data":"f6f410e5eed60ef63404a45aa646485229ea4291dde9b97de19f2d560f1620f8"}
Mar 14 06:12:04 crc kubenswrapper[4713]: I0314 06:12:04.351978 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6f410e5eed60ef63404a45aa646485229ea4291dde9b97de19f2d560f1620f8"
Mar 14 06:12:04 crc kubenswrapper[4713]: I0314 06:12:04.352040 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557812-2bxlg"
Mar 14 06:12:04 crc kubenswrapper[4713]: I0314 06:12:04.846820 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557806-q87kz"]
Mar 14 06:12:04 crc kubenswrapper[4713]: I0314 06:12:04.856949 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557806-q87kz"]
Mar 14 06:12:05 crc kubenswrapper[4713]: I0314 06:12:05.579248 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb67cc5-82d7-41b1-be5e-308a496a8944" path="/var/lib/kubelet/pods/0eb67cc5-82d7-41b1-be5e-308a496a8944/volumes"
Mar 14 06:12:12 crc kubenswrapper[4713]: I0314 06:12:12.564023 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"
Mar 14 06:12:12 crc kubenswrapper[4713]: E0314 06:12:12.564886 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:12:27 crc kubenswrapper[4713]: I0314 06:12:27.575310 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"
Mar 14 06:12:27 crc kubenswrapper[4713]: E0314 06:12:27.577109 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.424285 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xrjlv"]
Mar 14 06:12:37 crc kubenswrapper[4713]: E0314 06:12:37.425599 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a261a9d-d943-4ef6-a98d-5674ac862f1b" containerName="oc"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.425618 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a261a9d-d943-4ef6-a98d-5674ac862f1b" containerName="oc"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.425974 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a261a9d-d943-4ef6-a98d-5674ac862f1b" containerName="oc"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.428973 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.443148 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrjlv"]
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.564370 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjtb6\" (UniqueName: \"kubernetes.io/projected/bdc49afe-518e-4d99-b6c1-642fd565abbe-kube-api-access-tjtb6\") pod \"redhat-operators-xrjlv\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") " pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.564631 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-catalog-content\") pod \"redhat-operators-xrjlv\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") " pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.565039 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-utilities\") pod \"redhat-operators-xrjlv\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") " pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.667639 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-utilities\") pod \"redhat-operators-xrjlv\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") " pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.667717 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjtb6\" (UniqueName: \"kubernetes.io/projected/bdc49afe-518e-4d99-b6c1-642fd565abbe-kube-api-access-tjtb6\") pod \"redhat-operators-xrjlv\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") " pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.667821 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-catalog-content\") pod \"redhat-operators-xrjlv\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") " pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.668240 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-catalog-content\") pod \"redhat-operators-xrjlv\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") " pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.668681 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-utilities\") pod \"redhat-operators-xrjlv\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") " pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.690591 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjtb6\" (UniqueName: \"kubernetes.io/projected/bdc49afe-518e-4d99-b6c1-642fd565abbe-kube-api-access-tjtb6\") pod \"redhat-operators-xrjlv\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") " pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:37 crc kubenswrapper[4713]: I0314 06:12:37.754952 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:38 crc kubenswrapper[4713]: I0314 06:12:38.363181 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrjlv"]
Mar 14 06:12:38 crc kubenswrapper[4713]: I0314 06:12:38.564393 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"
Mar 14 06:12:38 crc kubenswrapper[4713]: E0314 06:12:38.564924 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:12:38 crc kubenswrapper[4713]: I0314 06:12:38.730823 4713 generic.go:334] "Generic (PLEG): container finished" podID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerID="ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0" exitCode=0
Mar 14 06:12:38 crc kubenswrapper[4713]: I0314 06:12:38.730867 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrjlv" event={"ID":"bdc49afe-518e-4d99-b6c1-642fd565abbe","Type":"ContainerDied","Data":"ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0"}
Mar 14 06:12:38 crc kubenswrapper[4713]: I0314 06:12:38.730889 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrjlv" event={"ID":"bdc49afe-518e-4d99-b6c1-642fd565abbe","Type":"ContainerStarted","Data":"2558b6525d356cef030163f57d4771a70420845fb9d33d879d4dea65be8b4e6c"}
Mar 14 06:12:38 crc kubenswrapper[4713]: I0314 06:12:38.733198 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 06:12:39 crc kubenswrapper[4713]: I0314 06:12:39.747129 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrjlv" event={"ID":"bdc49afe-518e-4d99-b6c1-642fd565abbe","Type":"ContainerStarted","Data":"8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2"}
Mar 14 06:12:45 crc kubenswrapper[4713]: I0314 06:12:45.816519 4713 generic.go:334] "Generic (PLEG): container finished" podID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerID="8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2" exitCode=0
Mar 14 06:12:45 crc kubenswrapper[4713]: I0314 06:12:45.816594 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrjlv" event={"ID":"bdc49afe-518e-4d99-b6c1-642fd565abbe","Type":"ContainerDied","Data":"8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2"}
Mar 14 06:12:46 crc kubenswrapper[4713]: I0314 06:12:46.831960 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrjlv" event={"ID":"bdc49afe-518e-4d99-b6c1-642fd565abbe","Type":"ContainerStarted","Data":"1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e"}
Mar 14 06:12:46 crc kubenswrapper[4713]: I0314 06:12:46.859447 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xrjlv" podStartSLOduration=2.325135519 podStartE2EDuration="9.859416096s" podCreationTimestamp="2026-03-14 06:12:37 +0000 UTC" firstStartedPulling="2026-03-14 06:12:38.732870285 +0000 UTC m=+2741.820779585" lastFinishedPulling="2026-03-14 06:12:46.267150862 +0000 UTC m=+2749.355060162" observedRunningTime="2026-03-14 06:12:46.855815613 +0000 UTC m=+2749.943724933" watchObservedRunningTime="2026-03-14 06:12:46.859416096 +0000 UTC m=+2749.947325436"
Mar 14 06:12:47 crc kubenswrapper[4713]: I0314 06:12:47.755165 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:47 crc kubenswrapper[4713]: I0314 06:12:47.755547 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:48 crc kubenswrapper[4713]: I0314 06:12:48.812383 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrjlv" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:12:48 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:12:48 crc kubenswrapper[4713]: >
Mar 14 06:12:50 crc kubenswrapper[4713]: I0314 06:12:50.573663 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"
Mar 14 06:12:50 crc kubenswrapper[4713]: I0314 06:12:50.695836 4713 scope.go:117] "RemoveContainer" containerID="cab44d3e44aa2cb3c991dcd7f30964d4168812a6d2b8b4aec45586cf4963b566"
Mar 14 06:12:51 crc kubenswrapper[4713]: I0314 06:12:51.895067 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"bc1c3f2721075751b0437b59794102fba21b72ff7f17b62a70d817f1d5adaee7"}
Mar 14 06:12:57 crc kubenswrapper[4713]: I0314 06:12:57.803458 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:57 crc kubenswrapper[4713]: I0314 06:12:57.870035 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:58 crc kubenswrapper[4713]: I0314 06:12:58.045574 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrjlv"]
Mar 14 06:12:58 crc kubenswrapper[4713]: I0314 06:12:58.979319 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xrjlv" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerName="registry-server" containerID="cri-o://1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e" gracePeriod=2
Mar 14 06:12:59 crc kubenswrapper[4713]: I0314 06:12:59.841398 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:59 crc kubenswrapper[4713]: I0314 06:12:59.990908 4713 generic.go:334] "Generic (PLEG): container finished" podID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerID="1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e" exitCode=0
Mar 14 06:12:59 crc kubenswrapper[4713]: I0314 06:12:59.991191 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrjlv"
Mar 14 06:12:59 crc kubenswrapper[4713]: I0314 06:12:59.991106 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrjlv" event={"ID":"bdc49afe-518e-4d99-b6c1-642fd565abbe","Type":"ContainerDied","Data":"1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e"}
Mar 14 06:12:59 crc kubenswrapper[4713]: I0314 06:12:59.991325 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrjlv" event={"ID":"bdc49afe-518e-4d99-b6c1-642fd565abbe","Type":"ContainerDied","Data":"2558b6525d356cef030163f57d4771a70420845fb9d33d879d4dea65be8b4e6c"}
Mar 14 06:12:59 crc kubenswrapper[4713]: I0314 06:12:59.991353 4713 scope.go:117] "RemoveContainer" containerID="1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e"
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.012669 4713 scope.go:117] "RemoveContainer" containerID="8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2"
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.027303 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjtb6\" (UniqueName: \"kubernetes.io/projected/bdc49afe-518e-4d99-b6c1-642fd565abbe-kube-api-access-tjtb6\") pod \"bdc49afe-518e-4d99-b6c1-642fd565abbe\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") "
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.027684 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-catalog-content\") pod \"bdc49afe-518e-4d99-b6c1-642fd565abbe\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") "
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.027768 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-utilities\") pod \"bdc49afe-518e-4d99-b6c1-642fd565abbe\" (UID: \"bdc49afe-518e-4d99-b6c1-642fd565abbe\") "
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.028849 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-utilities" (OuterVolumeSpecName: "utilities") pod "bdc49afe-518e-4d99-b6c1-642fd565abbe" (UID: "bdc49afe-518e-4d99-b6c1-642fd565abbe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.033868 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc49afe-518e-4d99-b6c1-642fd565abbe-kube-api-access-tjtb6" (OuterVolumeSpecName: "kube-api-access-tjtb6") pod "bdc49afe-518e-4d99-b6c1-642fd565abbe" (UID: "bdc49afe-518e-4d99-b6c1-642fd565abbe"). InnerVolumeSpecName "kube-api-access-tjtb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.035066 4713 scope.go:117] "RemoveContainer" containerID="ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0"
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.133165 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.133199 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjtb6\" (UniqueName: \"kubernetes.io/projected/bdc49afe-518e-4d99-b6c1-642fd565abbe-kube-api-access-tjtb6\") on node \"crc\" DevicePath \"\""
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.135830 4713 scope.go:117] "RemoveContainer" containerID="1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e"
Mar 14 06:13:00 crc kubenswrapper[4713]: E0314 06:13:00.136869 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e\": container with ID starting with 1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e not found: ID does not exist" containerID="1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e"
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.136988 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e"} err="failed to get container status \"1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e\": rpc error: code = NotFound desc = could not find container \"1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e\": container with ID starting with 1c92eb45863e1766da5574e422a6e1e7871282be356804f18e6354ad5258bf7e not found: ID does not exist"
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.137096 4713 scope.go:117] "RemoveContainer" containerID="8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2"
Mar 14 06:13:00 crc kubenswrapper[4713]: E0314 06:13:00.137556 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2\": container with ID starting with 8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2 not found: ID does not exist" containerID="8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2"
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.137602 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2"} err="failed to get container status \"8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2\": rpc error: code = NotFound desc = could not find container \"8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2\": container with ID starting with 8273d067eeba36d8bc2bb18ef612c70364bf35b47d52d37e239b1af190b8d4f2 not found: ID does not exist"
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.137638 4713 scope.go:117] "RemoveContainer" containerID="ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0"
Mar 14 06:13:00 crc kubenswrapper[4713]: E0314 06:13:00.137924 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0\": container with ID starting with ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0 not found: ID does not exist" containerID="ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0"
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.137980 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0"} err="failed to get container status \"ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0\": rpc error: code = NotFound desc = could not find container \"ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0\": container with ID starting with ad2a2db68546e234f798e8c416a9e266bfa84dcc857989aed5cb52b347e1f9e0 not found: ID does not exist"
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.164358 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdc49afe-518e-4d99-b6c1-642fd565abbe" (UID: "bdc49afe-518e-4d99-b6c1-642fd565abbe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.237088 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc49afe-518e-4d99-b6c1-642fd565abbe-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.347539 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrjlv"]
Mar 14 06:13:00 crc kubenswrapper[4713]: I0314 06:13:00.360467 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xrjlv"]
Mar 14 06:13:01 crc kubenswrapper[4713]: I0314 06:13:01.576874 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" path="/var/lib/kubelet/pods/bdc49afe-518e-4d99-b6c1-642fd565abbe/volumes"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.145289 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557814-kjzb5"]
Mar 14 06:14:00 crc kubenswrapper[4713]: E0314 06:14:00.146337 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerName="extract-utilities"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.146354 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerName="extract-utilities"
Mar 14 06:14:00 crc kubenswrapper[4713]: E0314 06:14:00.146369 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerName="extract-content"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.146377 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerName="extract-content"
Mar 14 06:14:00 crc kubenswrapper[4713]: E0314 06:14:00.146410 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerName="registry-server"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.146418 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerName="registry-server"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.146730 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc49afe-518e-4d99-b6c1-642fd565abbe" containerName="registry-server"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.147755 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557814-kjzb5"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.156460 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557814-kjzb5"]
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.175836 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.175844 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.175896 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.329015 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl7sc\" (UniqueName: \"kubernetes.io/projected/93afc687-91a0-4ddb-8f39-9ce3099939e0-kube-api-access-dl7sc\") pod \"auto-csr-approver-29557814-kjzb5\" (UID: \"93afc687-91a0-4ddb-8f39-9ce3099939e0\") " pod="openshift-infra/auto-csr-approver-29557814-kjzb5"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.430851 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl7sc\" (UniqueName: \"kubernetes.io/projected/93afc687-91a0-4ddb-8f39-9ce3099939e0-kube-api-access-dl7sc\") pod \"auto-csr-approver-29557814-kjzb5\" (UID: \"93afc687-91a0-4ddb-8f39-9ce3099939e0\") " pod="openshift-infra/auto-csr-approver-29557814-kjzb5"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.454626 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl7sc\" (UniqueName: \"kubernetes.io/projected/93afc687-91a0-4ddb-8f39-9ce3099939e0-kube-api-access-dl7sc\") pod \"auto-csr-approver-29557814-kjzb5\" (UID: \"93afc687-91a0-4ddb-8f39-9ce3099939e0\") " pod="openshift-infra/auto-csr-approver-29557814-kjzb5"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.492414 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557814-kjzb5"
Mar 14 06:14:00 crc kubenswrapper[4713]: I0314 06:14:00.986372 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557814-kjzb5"]
Mar 14 06:14:01 crc kubenswrapper[4713]: I0314 06:14:01.649489 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557814-kjzb5" event={"ID":"93afc687-91a0-4ddb-8f39-9ce3099939e0","Type":"ContainerStarted","Data":"0ee25256280154c599bf6278b0da45342f3c45ef8bbbba4e7a74ea7c55e5841a"}
Mar 14 06:14:02 crc kubenswrapper[4713]: I0314 06:14:02.662173 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557814-kjzb5" event={"ID":"93afc687-91a0-4ddb-8f39-9ce3099939e0","Type":"ContainerStarted","Data":"3ac76fdc72804ef071b957dded3f05ae659d5e0871949d310baf190732c7e4f1"}
Mar 14 06:14:02 crc kubenswrapper[4713]: I0314 06:14:02.685117 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557814-kjzb5" podStartSLOduration=1.696417237 podStartE2EDuration="2.685094516s" podCreationTimestamp="2026-03-14 06:14:00 +0000 UTC" firstStartedPulling="2026-03-14 06:14:00.976315977 +0000 UTC m=+2824.064225277" lastFinishedPulling="2026-03-14 06:14:01.964993256 +0000 UTC m=+2825.052902556" observedRunningTime="2026-03-14 06:14:02.679732867 +0000 UTC m=+2825.767642167" watchObservedRunningTime="2026-03-14 06:14:02.685094516 +0000 UTC m=+2825.773003816"
Mar 14 06:14:03 crc kubenswrapper[4713]: I0314 06:14:03.677641 4713 generic.go:334] "Generic (PLEG): container finished" podID="93afc687-91a0-4ddb-8f39-9ce3099939e0" containerID="3ac76fdc72804ef071b957dded3f05ae659d5e0871949d310baf190732c7e4f1" exitCode=0
Mar 14 06:14:03 crc kubenswrapper[4713]: I0314 06:14:03.677703 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557814-kjzb5" event={"ID":"93afc687-91a0-4ddb-8f39-9ce3099939e0","Type":"ContainerDied","Data":"3ac76fdc72804ef071b957dded3f05ae659d5e0871949d310baf190732c7e4f1"}
Mar 14 06:14:05 crc kubenswrapper[4713]: I0314 06:14:05.070012 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557814-kjzb5"
Mar 14 06:14:05 crc kubenswrapper[4713]: I0314 06:14:05.171301 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl7sc\" (UniqueName: \"kubernetes.io/projected/93afc687-91a0-4ddb-8f39-9ce3099939e0-kube-api-access-dl7sc\") pod \"93afc687-91a0-4ddb-8f39-9ce3099939e0\" (UID: \"93afc687-91a0-4ddb-8f39-9ce3099939e0\") "
Mar 14 06:14:05 crc kubenswrapper[4713]: I0314 06:14:05.177442 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93afc687-91a0-4ddb-8f39-9ce3099939e0-kube-api-access-dl7sc" (OuterVolumeSpecName: "kube-api-access-dl7sc") pod "93afc687-91a0-4ddb-8f39-9ce3099939e0" (UID: "93afc687-91a0-4ddb-8f39-9ce3099939e0"). InnerVolumeSpecName "kube-api-access-dl7sc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:14:05 crc kubenswrapper[4713]: I0314 06:14:05.274734 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl7sc\" (UniqueName: \"kubernetes.io/projected/93afc687-91a0-4ddb-8f39-9ce3099939e0-kube-api-access-dl7sc\") on node \"crc\" DevicePath \"\""
Mar 14 06:14:05 crc kubenswrapper[4713]: I0314 06:14:05.709617 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557814-kjzb5" event={"ID":"93afc687-91a0-4ddb-8f39-9ce3099939e0","Type":"ContainerDied","Data":"0ee25256280154c599bf6278b0da45342f3c45ef8bbbba4e7a74ea7c55e5841a"}
Mar 14 06:14:05 crc kubenswrapper[4713]: I0314 06:14:05.710016 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ee25256280154c599bf6278b0da45342f3c45ef8bbbba4e7a74ea7c55e5841a"
Mar 14 06:14:05 crc kubenswrapper[4713]: I0314 06:14:05.709731 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557814-kjzb5"
Mar 14 06:14:05 crc kubenswrapper[4713]: I0314 06:14:05.746556 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557808-mfw6r"]
Mar 14 06:14:05 crc kubenswrapper[4713]: I0314 06:14:05.756837 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557808-mfw6r"]
Mar 14 06:14:07 crc kubenswrapper[4713]: I0314 06:14:07.578514 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="592b0324-5eb1-47fe-ba55-b8f2e4ca924a" path="/var/lib/kubelet/pods/592b0324-5eb1-47fe-ba55-b8f2e4ca924a/volumes"
Mar 14 06:14:12 crc kubenswrapper[4713]: I0314 06:14:12.807836 4713 generic.go:334] "Generic (PLEG): container finished" podID="068a337b-3e10-4cdf-9883-6a9311bb4424" containerID="6fcae0f7c7f9b0a39fb56a17425cef739aca56c9d0a6a40ea04cd43128d249af" exitCode=0
Mar 14 06:14:12 crc kubenswrapper[4713]: I0314 06:14:12.807914 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc" event={"ID":"068a337b-3e10-4cdf-9883-6a9311bb4424","Type":"ContainerDied","Data":"6fcae0f7c7f9b0a39fb56a17425cef739aca56c9d0a6a40ea04cd43128d249af"}
Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.385425 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc"
Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.535220 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4twh7\" (UniqueName: \"kubernetes.io/projected/068a337b-3e10-4cdf-9883-6a9311bb4424-kube-api-access-4twh7\") pod \"068a337b-3e10-4cdf-9883-6a9311bb4424\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") "
Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.535314 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-combined-ca-bundle\") pod \"068a337b-3e10-4cdf-9883-6a9311bb4424\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") "
Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.535384 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-ssh-key-openstack-edpm-ipam\") pod \"068a337b-3e10-4cdf-9883-6a9311bb4424\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") "
Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.535478 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-inventory\") pod \"068a337b-3e10-4cdf-9883-6a9311bb4424\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") "
Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.535558
4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-secret-0\") pod \"068a337b-3e10-4cdf-9883-6a9311bb4424\" (UID: \"068a337b-3e10-4cdf-9883-6a9311bb4424\") " Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.550575 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068a337b-3e10-4cdf-9883-6a9311bb4424-kube-api-access-4twh7" (OuterVolumeSpecName: "kube-api-access-4twh7") pod "068a337b-3e10-4cdf-9883-6a9311bb4424" (UID: "068a337b-3e10-4cdf-9883-6a9311bb4424"). InnerVolumeSpecName "kube-api-access-4twh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.550675 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "068a337b-3e10-4cdf-9883-6a9311bb4424" (UID: "068a337b-3e10-4cdf-9883-6a9311bb4424"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.574742 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-inventory" (OuterVolumeSpecName: "inventory") pod "068a337b-3e10-4cdf-9883-6a9311bb4424" (UID: "068a337b-3e10-4cdf-9883-6a9311bb4424"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.580448 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "068a337b-3e10-4cdf-9883-6a9311bb4424" (UID: "068a337b-3e10-4cdf-9883-6a9311bb4424"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.583976 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "068a337b-3e10-4cdf-9883-6a9311bb4424" (UID: "068a337b-3e10-4cdf-9883-6a9311bb4424"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.638744 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.638790 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.638807 4713 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.638819 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4twh7\" (UniqueName: \"kubernetes.io/projected/068a337b-3e10-4cdf-9883-6a9311bb4424-kube-api-access-4twh7\") on node \"crc\" DevicePath \"\"" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.638833 4713 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068a337b-3e10-4cdf-9883-6a9311bb4424-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:14:14 crc kubenswrapper[4713]: 
I0314 06:14:14.833995 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc" event={"ID":"068a337b-3e10-4cdf-9883-6a9311bb4424","Type":"ContainerDied","Data":"c271c59b477bb8e483c9f1fe87c17c783a0131a41a2a1044b79f3bebc54b0d7e"} Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.834041 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c271c59b477bb8e483c9f1fe87c17c783a0131a41a2a1044b79f3bebc54b0d7e" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.834066 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.921329 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw"] Mar 14 06:14:14 crc kubenswrapper[4713]: E0314 06:14:14.922046 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93afc687-91a0-4ddb-8f39-9ce3099939e0" containerName="oc" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.922065 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="93afc687-91a0-4ddb-8f39-9ce3099939e0" containerName="oc" Mar 14 06:14:14 crc kubenswrapper[4713]: E0314 06:14:14.922087 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068a337b-3e10-4cdf-9883-6a9311bb4424" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.922095 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="068a337b-3e10-4cdf-9883-6a9311bb4424" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.922414 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="93afc687-91a0-4ddb-8f39-9ce3099939e0" containerName="oc" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.922435 4713 
memory_manager.go:354] "RemoveStaleState removing state" podUID="068a337b-3e10-4cdf-9883-6a9311bb4424" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.923828 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.927762 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.928170 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.928695 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.929683 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.933583 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.933840 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.934067 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:14:14 crc kubenswrapper[4713]: I0314 06:14:14.939239 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw"] Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.048580 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.049149 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.049189 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.049227 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snblw\" (UniqueName: \"kubernetes.io/projected/713308d3-fe7b-40f0-84b6-671a2defaf7b-kube-api-access-snblw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.049273 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: 
\"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.049307 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.049346 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.049419 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.049440 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc 
kubenswrapper[4713]: I0314 06:14:15.049468 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.049498 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.151758 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.152182 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.152356 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.152764 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.152854 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.152930 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.152998 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.153271 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.153364 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.153450 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.153507 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snblw\" (UniqueName: \"kubernetes.io/projected/713308d3-fe7b-40f0-84b6-671a2defaf7b-kube-api-access-snblw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.154286 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.156695 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.157766 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.158272 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.158335 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc 
kubenswrapper[4713]: I0314 06:14:15.158643 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.159424 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.163329 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.163825 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.164074 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.181031 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snblw\" (UniqueName: \"kubernetes.io/projected/713308d3-fe7b-40f0-84b6-671a2defaf7b-kube-api-access-snblw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4tcw\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.247522 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:14:15 crc kubenswrapper[4713]: I0314 06:14:15.909346 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw"] Mar 14 06:14:16 crc kubenswrapper[4713]: I0314 06:14:16.857430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" event={"ID":"713308d3-fe7b-40f0-84b6-671a2defaf7b","Type":"ContainerStarted","Data":"4c0b31cb1ddcca145d51f75717a28454762730a7c0891e353b2f75ad53933c6b"} Mar 14 06:14:16 crc kubenswrapper[4713]: I0314 06:14:16.857709 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" event={"ID":"713308d3-fe7b-40f0-84b6-671a2defaf7b","Type":"ContainerStarted","Data":"190db2e20f7213d3e5c52910ce99ac28e68335294a81a6b977679b4f3168dd79"} Mar 14 06:14:16 crc kubenswrapper[4713]: I0314 06:14:16.886800 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" podStartSLOduration=2.301880125 podStartE2EDuration="2.886781527s" podCreationTimestamp="2026-03-14 06:14:14 +0000 UTC" firstStartedPulling="2026-03-14 
06:14:15.914716031 +0000 UTC m=+2839.002625321" lastFinishedPulling="2026-03-14 06:14:16.499617423 +0000 UTC m=+2839.587526723" observedRunningTime="2026-03-14 06:14:16.882390998 +0000 UTC m=+2839.970300298" watchObservedRunningTime="2026-03-14 06:14:16.886781527 +0000 UTC m=+2839.974690827" Mar 14 06:14:50 crc kubenswrapper[4713]: I0314 06:14:50.830052 4713 scope.go:117] "RemoveContainer" containerID="94a46d1f8713447717f606fbe3485991f7bdc44a979086a93820f84856babd29" Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.163367 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"] Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.165981 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.169552 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.169882 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.178330 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"] Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.331022 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf42096c-26c2-4b01-9771-89caa06e1293-config-volume\") pod \"collect-profiles-29557815-p9p5h\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.331454 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf42096c-26c2-4b01-9771-89caa06e1293-secret-volume\") pod \"collect-profiles-29557815-p9p5h\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.331679 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnmvd\" (UniqueName: \"kubernetes.io/projected/bf42096c-26c2-4b01-9771-89caa06e1293-kube-api-access-fnmvd\") pod \"collect-profiles-29557815-p9p5h\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.433659 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf42096c-26c2-4b01-9771-89caa06e1293-config-volume\") pod \"collect-profiles-29557815-p9p5h\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.433741 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf42096c-26c2-4b01-9771-89caa06e1293-secret-volume\") pod \"collect-profiles-29557815-p9p5h\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.433852 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnmvd\" (UniqueName: \"kubernetes.io/projected/bf42096c-26c2-4b01-9771-89caa06e1293-kube-api-access-fnmvd\") pod \"collect-profiles-29557815-p9p5h\" (UID: 
\"bf42096c-26c2-4b01-9771-89caa06e1293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"
Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.435013 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf42096c-26c2-4b01-9771-89caa06e1293-config-volume\") pod \"collect-profiles-29557815-p9p5h\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"
Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.440648 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf42096c-26c2-4b01-9771-89caa06e1293-secret-volume\") pod \"collect-profiles-29557815-p9p5h\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"
Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.448551 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnmvd\" (UniqueName: \"kubernetes.io/projected/bf42096c-26c2-4b01-9771-89caa06e1293-kube-api-access-fnmvd\") pod \"collect-profiles-29557815-p9p5h\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"
Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.524565 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"
Mar 14 06:15:00 crc kubenswrapper[4713]: I0314 06:15:00.985746 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"]
Mar 14 06:15:00 crc kubenswrapper[4713]: W0314 06:15:00.998669 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf42096c_26c2_4b01_9771_89caa06e1293.slice/crio-7c157528f082ab3ac03243be9dbd4ed0898565d126e86a98c5eb6334430b553c WatchSource:0}: Error finding container 7c157528f082ab3ac03243be9dbd4ed0898565d126e86a98c5eb6334430b553c: Status 404 returned error can't find the container with id 7c157528f082ab3ac03243be9dbd4ed0898565d126e86a98c5eb6334430b553c
Mar 14 06:15:01 crc kubenswrapper[4713]: I0314 06:15:01.367767 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" event={"ID":"bf42096c-26c2-4b01-9771-89caa06e1293","Type":"ContainerStarted","Data":"78ab633a44b9185e7ad5ff00aab9466bd19f48c55744f7ac689a93c5062a3d99"}
Mar 14 06:15:01 crc kubenswrapper[4713]: I0314 06:15:01.367820 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" event={"ID":"bf42096c-26c2-4b01-9771-89caa06e1293","Type":"ContainerStarted","Data":"7c157528f082ab3ac03243be9dbd4ed0898565d126e86a98c5eb6334430b553c"}
Mar 14 06:15:01 crc kubenswrapper[4713]: I0314 06:15:01.395526 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" podStartSLOduration=1.3955068210000001 podStartE2EDuration="1.395506821s" podCreationTimestamp="2026-03-14 06:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 06:15:01.391522956 +0000 UTC m=+2884.479432296" watchObservedRunningTime="2026-03-14 06:15:01.395506821 +0000 UTC m=+2884.483416121"
Mar 14 06:15:02 crc kubenswrapper[4713]: I0314 06:15:02.379305 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf42096c-26c2-4b01-9771-89caa06e1293" containerID="78ab633a44b9185e7ad5ff00aab9466bd19f48c55744f7ac689a93c5062a3d99" exitCode=0
Mar 14 06:15:02 crc kubenswrapper[4713]: I0314 06:15:02.379388 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" event={"ID":"bf42096c-26c2-4b01-9771-89caa06e1293","Type":"ContainerDied","Data":"78ab633a44b9185e7ad5ff00aab9466bd19f48c55744f7ac689a93c5062a3d99"}
Mar 14 06:15:03 crc kubenswrapper[4713]: I0314 06:15:03.854199 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"
Mar 14 06:15:03 crc kubenswrapper[4713]: I0314 06:15:03.939978 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnmvd\" (UniqueName: \"kubernetes.io/projected/bf42096c-26c2-4b01-9771-89caa06e1293-kube-api-access-fnmvd\") pod \"bf42096c-26c2-4b01-9771-89caa06e1293\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") "
Mar 14 06:15:03 crc kubenswrapper[4713]: I0314 06:15:03.940875 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf42096c-26c2-4b01-9771-89caa06e1293-secret-volume\") pod \"bf42096c-26c2-4b01-9771-89caa06e1293\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") "
Mar 14 06:15:03 crc kubenswrapper[4713]: I0314 06:15:03.941436 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf42096c-26c2-4b01-9771-89caa06e1293-config-volume\") pod \"bf42096c-26c2-4b01-9771-89caa06e1293\" (UID: \"bf42096c-26c2-4b01-9771-89caa06e1293\") "
Mar 14 06:15:03 crc kubenswrapper[4713]: I0314 06:15:03.942387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf42096c-26c2-4b01-9771-89caa06e1293-config-volume" (OuterVolumeSpecName: "config-volume") pod "bf42096c-26c2-4b01-9771-89caa06e1293" (UID: "bf42096c-26c2-4b01-9771-89caa06e1293"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 06:15:03 crc kubenswrapper[4713]: I0314 06:15:03.943969 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf42096c-26c2-4b01-9771-89caa06e1293-config-volume\") on node \"crc\" DevicePath \"\""
Mar 14 06:15:03 crc kubenswrapper[4713]: I0314 06:15:03.948317 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf42096c-26c2-4b01-9771-89caa06e1293-kube-api-access-fnmvd" (OuterVolumeSpecName: "kube-api-access-fnmvd") pod "bf42096c-26c2-4b01-9771-89caa06e1293" (UID: "bf42096c-26c2-4b01-9771-89caa06e1293"). InnerVolumeSpecName "kube-api-access-fnmvd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:15:03 crc kubenswrapper[4713]: I0314 06:15:03.952783 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf42096c-26c2-4b01-9771-89caa06e1293-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bf42096c-26c2-4b01-9771-89caa06e1293" (UID: "bf42096c-26c2-4b01-9771-89caa06e1293"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:15:04 crc kubenswrapper[4713]: I0314 06:15:04.045563 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf42096c-26c2-4b01-9771-89caa06e1293-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 14 06:15:04 crc kubenswrapper[4713]: I0314 06:15:04.045592 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnmvd\" (UniqueName: \"kubernetes.io/projected/bf42096c-26c2-4b01-9771-89caa06e1293-kube-api-access-fnmvd\") on node \"crc\" DevicePath \"\""
Mar 14 06:15:04 crc kubenswrapper[4713]: I0314 06:15:04.417805 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h" event={"ID":"bf42096c-26c2-4b01-9771-89caa06e1293","Type":"ContainerDied","Data":"7c157528f082ab3ac03243be9dbd4ed0898565d126e86a98c5eb6334430b553c"}
Mar 14 06:15:04 crc kubenswrapper[4713]: I0314 06:15:04.417850 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c157528f082ab3ac03243be9dbd4ed0898565d126e86a98c5eb6334430b553c"
Mar 14 06:15:04 crc kubenswrapper[4713]: I0314 06:15:04.417914 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"
Mar 14 06:15:04 crc kubenswrapper[4713]: I0314 06:15:04.473318 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29"]
Mar 14 06:15:04 crc kubenswrapper[4713]: I0314 06:15:04.488373 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557770-vln29"]
Mar 14 06:15:05 crc kubenswrapper[4713]: I0314 06:15:05.577524 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2608a13-9ee1-45ed-926b-329192ef4d34" path="/var/lib/kubelet/pods/d2608a13-9ee1-45ed-926b-329192ef4d34/volumes"
Mar 14 06:15:10 crc kubenswrapper[4713]: I0314 06:15:10.732005 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 06:15:10 crc kubenswrapper[4713]: I0314 06:15:10.732662 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 06:15:40 crc kubenswrapper[4713]: I0314 06:15:40.731898 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 06:15:40 crc kubenswrapper[4713]: I0314 06:15:40.732603 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 06:15:50 crc kubenswrapper[4713]: I0314 06:15:50.931022 4713 scope.go:117] "RemoveContainer" containerID="58d1e11cdb6b911204b129bf9878c939b127cfb6725ecab7d2093c69416fa9b3"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.144864 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557816-z8vqm"]
Mar 14 06:16:00 crc kubenswrapper[4713]: E0314 06:16:00.146114 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf42096c-26c2-4b01-9771-89caa06e1293" containerName="collect-profiles"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.146131 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf42096c-26c2-4b01-9771-89caa06e1293" containerName="collect-profiles"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.146511 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf42096c-26c2-4b01-9771-89caa06e1293" containerName="collect-profiles"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.147701 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557816-z8vqm"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.151123 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.151192 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.151135 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.156408 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557816-z8vqm"]
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.353246 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9xmv\" (UniqueName: \"kubernetes.io/projected/47e826e6-870e-4d27-afbe-fe1547e39e61-kube-api-access-b9xmv\") pod \"auto-csr-approver-29557816-z8vqm\" (UID: \"47e826e6-870e-4d27-afbe-fe1547e39e61\") " pod="openshift-infra/auto-csr-approver-29557816-z8vqm"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.455681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9xmv\" (UniqueName: \"kubernetes.io/projected/47e826e6-870e-4d27-afbe-fe1547e39e61-kube-api-access-b9xmv\") pod \"auto-csr-approver-29557816-z8vqm\" (UID: \"47e826e6-870e-4d27-afbe-fe1547e39e61\") " pod="openshift-infra/auto-csr-approver-29557816-z8vqm"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.478897 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9xmv\" (UniqueName: \"kubernetes.io/projected/47e826e6-870e-4d27-afbe-fe1547e39e61-kube-api-access-b9xmv\") pod \"auto-csr-approver-29557816-z8vqm\" (UID: \"47e826e6-870e-4d27-afbe-fe1547e39e61\") " pod="openshift-infra/auto-csr-approver-29557816-z8vqm"
Mar 14 06:16:00 crc kubenswrapper[4713]: I0314 06:16:00.769433 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557816-z8vqm"
Mar 14 06:16:01 crc kubenswrapper[4713]: I0314 06:16:01.281936 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557816-z8vqm"]
Mar 14 06:16:02 crc kubenswrapper[4713]: I0314 06:16:02.041607 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557816-z8vqm" event={"ID":"47e826e6-870e-4d27-afbe-fe1547e39e61","Type":"ContainerStarted","Data":"3f7eefc6e407029cad6ef2e2d06c185adffbfbd9d8c66c03e3faabc251b42185"}
Mar 14 06:16:03 crc kubenswrapper[4713]: I0314 06:16:03.054911 4713 generic.go:334] "Generic (PLEG): container finished" podID="47e826e6-870e-4d27-afbe-fe1547e39e61" containerID="9ae085736bd56fae1c869c5579fc76ee2f37023947d37d167da40faf6384bb2a" exitCode=0
Mar 14 06:16:03 crc kubenswrapper[4713]: I0314 06:16:03.054986 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557816-z8vqm" event={"ID":"47e826e6-870e-4d27-afbe-fe1547e39e61","Type":"ContainerDied","Data":"9ae085736bd56fae1c869c5579fc76ee2f37023947d37d167da40faf6384bb2a"}
Mar 14 06:16:04 crc kubenswrapper[4713]: I0314 06:16:04.491360 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557816-z8vqm"
Mar 14 06:16:04 crc kubenswrapper[4713]: I0314 06:16:04.695069 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9xmv\" (UniqueName: \"kubernetes.io/projected/47e826e6-870e-4d27-afbe-fe1547e39e61-kube-api-access-b9xmv\") pod \"47e826e6-870e-4d27-afbe-fe1547e39e61\" (UID: \"47e826e6-870e-4d27-afbe-fe1547e39e61\") "
Mar 14 06:16:04 crc kubenswrapper[4713]: I0314 06:16:04.702098 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e826e6-870e-4d27-afbe-fe1547e39e61-kube-api-access-b9xmv" (OuterVolumeSpecName: "kube-api-access-b9xmv") pod "47e826e6-870e-4d27-afbe-fe1547e39e61" (UID: "47e826e6-870e-4d27-afbe-fe1547e39e61"). InnerVolumeSpecName "kube-api-access-b9xmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:16:04 crc kubenswrapper[4713]: I0314 06:16:04.798749 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9xmv\" (UniqueName: \"kubernetes.io/projected/47e826e6-870e-4d27-afbe-fe1547e39e61-kube-api-access-b9xmv\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:05 crc kubenswrapper[4713]: I0314 06:16:05.081071 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557816-z8vqm" event={"ID":"47e826e6-870e-4d27-afbe-fe1547e39e61","Type":"ContainerDied","Data":"3f7eefc6e407029cad6ef2e2d06c185adffbfbd9d8c66c03e3faabc251b42185"}
Mar 14 06:16:05 crc kubenswrapper[4713]: I0314 06:16:05.081449 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7eefc6e407029cad6ef2e2d06c185adffbfbd9d8c66c03e3faabc251b42185"
Mar 14 06:16:05 crc kubenswrapper[4713]: I0314 06:16:05.081544 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557816-z8vqm"
Mar 14 06:16:05 crc kubenswrapper[4713]: I0314 06:16:05.577901 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557810-xpw56"]
Mar 14 06:16:05 crc kubenswrapper[4713]: I0314 06:16:05.586873 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557810-xpw56"]
Mar 14 06:16:07 crc kubenswrapper[4713]: I0314 06:16:07.577839 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae550f4-6433-41cd-a287-1e2e6ee25c55" path="/var/lib/kubelet/pods/dae550f4-6433-41cd-a287-1e2e6ee25c55/volumes"
Mar 14 06:16:10 crc kubenswrapper[4713]: I0314 06:16:10.731197 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 06:16:10 crc kubenswrapper[4713]: I0314 06:16:10.731829 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 06:16:10 crc kubenswrapper[4713]: I0314 06:16:10.731914 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5"
Mar 14 06:16:10 crc kubenswrapper[4713]: I0314 06:16:10.732996 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bc1c3f2721075751b0437b59794102fba21b72ff7f17b62a70d817f1d5adaee7"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 06:16:10 crc kubenswrapper[4713]: I0314 06:16:10.733070 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://bc1c3f2721075751b0437b59794102fba21b72ff7f17b62a70d817f1d5adaee7" gracePeriod=600
Mar 14 06:16:11 crc kubenswrapper[4713]: I0314 06:16:11.180431 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="bc1c3f2721075751b0437b59794102fba21b72ff7f17b62a70d817f1d5adaee7" exitCode=0
Mar 14 06:16:11 crc kubenswrapper[4713]: I0314 06:16:11.180499 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"bc1c3f2721075751b0437b59794102fba21b72ff7f17b62a70d817f1d5adaee7"}
Mar 14 06:16:11 crc kubenswrapper[4713]: I0314 06:16:11.180847 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"}
Mar 14 06:16:11 crc kubenswrapper[4713]: I0314 06:16:11.180875 4713 scope.go:117] "RemoveContainer" containerID="6125b3f374732f62d952f7c70ae4f0f62223d93200f7ca4a859c0d32404132df"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.559290 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nz5qc"]
Mar 14 06:16:41 crc kubenswrapper[4713]: E0314 06:16:41.561594 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e826e6-870e-4d27-afbe-fe1547e39e61" containerName="oc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.561633 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e826e6-870e-4d27-afbe-fe1547e39e61" containerName="oc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.562138 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e826e6-870e-4d27-afbe-fe1547e39e61" containerName="oc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.565770 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.598594 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nz5qc"]
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.738079 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-catalog-content\") pod \"certified-operators-nz5qc\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.738269 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-utilities\") pod \"certified-operators-nz5qc\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.738349 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcks2\" (UniqueName: \"kubernetes.io/projected/7b9f3422-a9bc-4615-8da6-a6625e3329e4-kube-api-access-jcks2\") pod \"certified-operators-nz5qc\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.840434 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-utilities\") pod \"certified-operators-nz5qc\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.840543 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcks2\" (UniqueName: \"kubernetes.io/projected/7b9f3422-a9bc-4615-8da6-a6625e3329e4-kube-api-access-jcks2\") pod \"certified-operators-nz5qc\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.840768 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-catalog-content\") pod \"certified-operators-nz5qc\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.841377 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-catalog-content\") pod \"certified-operators-nz5qc\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.841645 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-utilities\") pod \"certified-operators-nz5qc\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.861769 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcks2\" (UniqueName: \"kubernetes.io/projected/7b9f3422-a9bc-4615-8da6-a6625e3329e4-kube-api-access-jcks2\") pod \"certified-operators-nz5qc\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:41 crc kubenswrapper[4713]: I0314 06:16:41.900010 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nz5qc"
Mar 14 06:16:42 crc kubenswrapper[4713]: I0314 06:16:42.504244 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nz5qc"]
Mar 14 06:16:42 crc kubenswrapper[4713]: I0314 06:16:42.562670 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz5qc" event={"ID":"7b9f3422-a9bc-4615-8da6-a6625e3329e4","Type":"ContainerStarted","Data":"2d1e9fa9fa0d4db887074a00e035e68c8c721df254a70cc587288d4a5710e527"}
Mar 14 06:16:43 crc kubenswrapper[4713]: I0314 06:16:43.579110 4713 generic.go:334] "Generic (PLEG): container finished" podID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerID="16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200" exitCode=0
Mar 14 06:16:43 crc kubenswrapper[4713]: I0314 06:16:43.581678 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz5qc" event={"ID":"7b9f3422-a9bc-4615-8da6-a6625e3329e4","Type":"ContainerDied","Data":"16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200"}
Mar 14 06:16:44 crc kubenswrapper[4713]: I0314 06:16:44.591648 4713 generic.go:334] "Generic (PLEG): container finished" podID="713308d3-fe7b-40f0-84b6-671a2defaf7b" containerID="4c0b31cb1ddcca145d51f75717a28454762730a7c0891e353b2f75ad53933c6b" exitCode=0
Mar 14 06:16:44 crc kubenswrapper[4713]: I0314 06:16:44.591692 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" event={"ID":"713308d3-fe7b-40f0-84b6-671a2defaf7b","Type":"ContainerDied","Data":"4c0b31cb1ddcca145d51f75717a28454762730a7c0891e353b2f75ad53933c6b"}
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.099547 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw"
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.270686 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-2\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.271031 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-3\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.271168 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-extra-config-0\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.271308 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-0\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.271482 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-inventory\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.271580 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-1\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.271690 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-ssh-key-openstack-edpm-ipam\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.271831 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-0\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.271978 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snblw\" (UniqueName: \"kubernetes.io/projected/713308d3-fe7b-40f0-84b6-671a2defaf7b-kube-api-access-snblw\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.272077 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-1\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.272251 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-combined-ca-bundle\") pod \"713308d3-fe7b-40f0-84b6-671a2defaf7b\" (UID: \"713308d3-fe7b-40f0-84b6-671a2defaf7b\") "
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.277844 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.284497 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713308d3-fe7b-40f0-84b6-671a2defaf7b-kube-api-access-snblw" (OuterVolumeSpecName: "kube-api-access-snblw") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "kube-api-access-snblw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.307159 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.310387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.315710 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.322387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.326794 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-inventory" (OuterVolumeSpecName: "inventory") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.329168 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.339277 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.339295 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.346555 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "713308d3-fe7b-40f0-84b6-671a2defaf7b" (UID: "713308d3-fe7b-40f0-84b6-671a2defaf7b"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375479 4713 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375521 4713 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375537 4713 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375551 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-inventory\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375564 4713 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375578 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375591 4713 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375604 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snblw\" (UniqueName: \"kubernetes.io/projected/713308d3-fe7b-40f0-84b6-671a2defaf7b-kube-api-access-snblw\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375619 4713 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375631 4713 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.375643 4713 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/713308d3-fe7b-40f0-84b6-671a2defaf7b-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.614679 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" event={"ID":"713308d3-fe7b-40f0-84b6-671a2defaf7b","Type":"ContainerDied","Data":"190db2e20f7213d3e5c52910ce99ac28e68335294a81a6b977679b4f3168dd79"}
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.614723 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="190db2e20f7213d3e5c52910ce99ac28e68335294a81a6b977679b4f3168dd79"
Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.614782 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4tcw" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.622566 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz5qc" event={"ID":"7b9f3422-a9bc-4615-8da6-a6625e3329e4","Type":"ContainerStarted","Data":"13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a"} Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.735110 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2"] Mar 14 06:16:46 crc kubenswrapper[4713]: E0314 06:16:46.735816 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713308d3-fe7b-40f0-84b6-671a2defaf7b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.735844 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="713308d3-fe7b-40f0-84b6-671a2defaf7b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.736129 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="713308d3-fe7b-40f0-84b6-671a2defaf7b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.738685 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.742986 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.743067 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.743080 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.743179 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.747094 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.752313 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2"] Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.903919 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.904524 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: 
\"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.904734 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.904887 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.905013 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcgnd\" (UniqueName: \"kubernetes.io/projected/b1d72cf9-f971-476d-a917-bb56b1280ac0-kube-api-access-mcgnd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.905151 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:46 crc kubenswrapper[4713]: I0314 06:16:46.905323 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.007634 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.007725 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.007783 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.007818 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.007866 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcgnd\" (UniqueName: \"kubernetes.io/projected/b1d72cf9-f971-476d-a917-bb56b1280ac0-kube-api-access-mcgnd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.007940 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.007997 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.012195 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.012713 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.012763 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.012918 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.014030 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.024558 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.034916 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcgnd\" (UniqueName: \"kubernetes.io/projected/b1d72cf9-f971-476d-a917-bb56b1280ac0-kube-api-access-mcgnd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-przw2\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.072189 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:16:47 crc kubenswrapper[4713]: I0314 06:16:47.728943 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2"] Mar 14 06:16:48 crc kubenswrapper[4713]: I0314 06:16:48.642987 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" event={"ID":"b1d72cf9-f971-476d-a917-bb56b1280ac0","Type":"ContainerStarted","Data":"980b3d9cff835f07a05c050d9f96adfd704f053ff00de130c15561082e320ea1"} Mar 14 06:16:50 crc kubenswrapper[4713]: I0314 06:16:50.663966 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" event={"ID":"b1d72cf9-f971-476d-a917-bb56b1280ac0","Type":"ContainerStarted","Data":"0af38c909c52f7fc78e6c32eb77a191ba04e0dc7cb915095b9c5f54267e15441"} Mar 14 06:16:50 crc kubenswrapper[4713]: I0314 06:16:50.685867 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" podStartSLOduration=2.464344821 podStartE2EDuration="4.685850051s" podCreationTimestamp="2026-03-14 06:16:46 +0000 UTC" firstStartedPulling="2026-03-14 06:16:47.726868178 +0000 UTC m=+2990.814777468" lastFinishedPulling="2026-03-14 06:16:49.948373398 +0000 UTC m=+2993.036282698" observedRunningTime="2026-03-14 06:16:50.682990284 +0000 UTC m=+2993.770899604" watchObservedRunningTime="2026-03-14 06:16:50.685850051 +0000 UTC m=+2993.773759351" Mar 14 06:16:51 crc kubenswrapper[4713]: I0314 06:16:51.001418 4713 scope.go:117] "RemoveContainer" containerID="5ef792da3e4f3d01adec9b047aa6808e639fdd64b16f60f81406ffcc3ea78aac" Mar 14 06:16:52 crc kubenswrapper[4713]: I0314 06:16:52.685876 4713 generic.go:334] "Generic (PLEG): container finished" podID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" 
containerID="13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a" exitCode=0 Mar 14 06:16:52 crc kubenswrapper[4713]: I0314 06:16:52.685920 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz5qc" event={"ID":"7b9f3422-a9bc-4615-8da6-a6625e3329e4","Type":"ContainerDied","Data":"13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a"} Mar 14 06:16:54 crc kubenswrapper[4713]: I0314 06:16:54.709167 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz5qc" event={"ID":"7b9f3422-a9bc-4615-8da6-a6625e3329e4","Type":"ContainerStarted","Data":"ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e"} Mar 14 06:16:54 crc kubenswrapper[4713]: I0314 06:16:54.756693 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nz5qc" podStartSLOduration=3.696624588 podStartE2EDuration="13.756669757s" podCreationTimestamp="2026-03-14 06:16:41 +0000 UTC" firstStartedPulling="2026-03-14 06:16:43.581195525 +0000 UTC m=+2986.669104845" lastFinishedPulling="2026-03-14 06:16:53.641240714 +0000 UTC m=+2996.729150014" observedRunningTime="2026-03-14 06:16:54.744641771 +0000 UTC m=+2997.832551071" watchObservedRunningTime="2026-03-14 06:16:54.756669757 +0000 UTC m=+2997.844579067" Mar 14 06:17:01 crc kubenswrapper[4713]: I0314 06:17:01.901306 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nz5qc" Mar 14 06:17:01 crc kubenswrapper[4713]: I0314 06:17:01.901855 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nz5qc" Mar 14 06:17:01 crc kubenswrapper[4713]: I0314 06:17:01.947562 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nz5qc" Mar 14 06:17:02 crc kubenswrapper[4713]: I0314 
06:17:02.871639 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nz5qc" Mar 14 06:17:02 crc kubenswrapper[4713]: I0314 06:17:02.925323 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nz5qc"] Mar 14 06:17:04 crc kubenswrapper[4713]: I0314 06:17:04.843999 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nz5qc" podUID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerName="registry-server" containerID="cri-o://ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e" gracePeriod=2 Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.420613 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nz5qc" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.539326 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcks2\" (UniqueName: \"kubernetes.io/projected/7b9f3422-a9bc-4615-8da6-a6625e3329e4-kube-api-access-jcks2\") pod \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.539425 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-utilities\") pod \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.539687 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-catalog-content\") pod \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\" (UID: \"7b9f3422-a9bc-4615-8da6-a6625e3329e4\") " Mar 14 06:17:05 crc kubenswrapper[4713]: 
I0314 06:17:05.540573 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-utilities" (OuterVolumeSpecName: "utilities") pod "7b9f3422-a9bc-4615-8da6-a6625e3329e4" (UID: "7b9f3422-a9bc-4615-8da6-a6625e3329e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.545065 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9f3422-a9bc-4615-8da6-a6625e3329e4-kube-api-access-jcks2" (OuterVolumeSpecName: "kube-api-access-jcks2") pod "7b9f3422-a9bc-4615-8da6-a6625e3329e4" (UID: "7b9f3422-a9bc-4615-8da6-a6625e3329e4"). InnerVolumeSpecName "kube-api-access-jcks2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.600091 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b9f3422-a9bc-4615-8da6-a6625e3329e4" (UID: "7b9f3422-a9bc-4615-8da6-a6625e3329e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.647097 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcks2\" (UniqueName: \"kubernetes.io/projected/7b9f3422-a9bc-4615-8da6-a6625e3329e4-kube-api-access-jcks2\") on node \"crc\" DevicePath \"\"" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.647133 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.647198 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9f3422-a9bc-4615-8da6-a6625e3329e4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.858695 4713 generic.go:334] "Generic (PLEG): container finished" podID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerID="ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e" exitCode=0 Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.858748 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz5qc" event={"ID":"7b9f3422-a9bc-4615-8da6-a6625e3329e4","Type":"ContainerDied","Data":"ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e"} Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.858778 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz5qc" event={"ID":"7b9f3422-a9bc-4615-8da6-a6625e3329e4","Type":"ContainerDied","Data":"2d1e9fa9fa0d4db887074a00e035e68c8c721df254a70cc587288d4a5710e527"} Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.858784 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nz5qc" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.858794 4713 scope.go:117] "RemoveContainer" containerID="ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.889590 4713 scope.go:117] "RemoveContainer" containerID="13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.896350 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nz5qc"] Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.906560 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nz5qc"] Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.911775 4713 scope.go:117] "RemoveContainer" containerID="16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.971858 4713 scope.go:117] "RemoveContainer" containerID="ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e" Mar 14 06:17:05 crc kubenswrapper[4713]: E0314 06:17:05.972359 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e\": container with ID starting with ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e not found: ID does not exist" containerID="ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.972392 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e"} err="failed to get container status \"ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e\": rpc error: code = NotFound desc = could not find 
container \"ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e\": container with ID starting with ecad1028bc3b3ec7ccfb40949406de50f6cfa9dcfcda3b5632f90cb30bbe018e not found: ID does not exist" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.972414 4713 scope.go:117] "RemoveContainer" containerID="13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a" Mar 14 06:17:05 crc kubenswrapper[4713]: E0314 06:17:05.972739 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a\": container with ID starting with 13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a not found: ID does not exist" containerID="13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.972763 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a"} err="failed to get container status \"13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a\": rpc error: code = NotFound desc = could not find container \"13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a\": container with ID starting with 13c41add852db2fd9111ec48fbb6b3a3fcd4b5a70bdb7210987ddb8597d8667a not found: ID does not exist" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.972775 4713 scope.go:117] "RemoveContainer" containerID="16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200" Mar 14 06:17:05 crc kubenswrapper[4713]: E0314 06:17:05.973003 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200\": container with ID starting with 16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200 not found: ID does 
not exist" containerID="16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200" Mar 14 06:17:05 crc kubenswrapper[4713]: I0314 06:17:05.973030 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200"} err="failed to get container status \"16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200\": rpc error: code = NotFound desc = could not find container \"16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200\": container with ID starting with 16e64e039a9fb551d8481e02a51786b80629026dc971a2a112f786a8d5290200 not found: ID does not exist" Mar 14 06:17:07 crc kubenswrapper[4713]: I0314 06:17:07.577554 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" path="/var/lib/kubelet/pods/7b9f3422-a9bc-4615-8da6-a6625e3329e4/volumes" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.232938 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-shf7t"] Mar 14 06:17:18 crc kubenswrapper[4713]: E0314 06:17:18.234243 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerName="extract-utilities" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.234265 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerName="extract-utilities" Mar 14 06:17:18 crc kubenswrapper[4713]: E0314 06:17:18.234283 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerName="registry-server" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.234291 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerName="registry-server" Mar 14 06:17:18 crc kubenswrapper[4713]: E0314 06:17:18.234350 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerName="extract-content" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.234359 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerName="extract-content" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.234655 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9f3422-a9bc-4615-8da6-a6625e3329e4" containerName="registry-server" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.236970 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.246431 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shf7t"] Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.334991 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-catalog-content\") pod \"community-operators-shf7t\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.335603 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-utilities\") pod \"community-operators-shf7t\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.335937 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r79tp\" (UniqueName: \"kubernetes.io/projected/c599c0c8-d331-4499-be56-07e579b6bf2a-kube-api-access-r79tp\") pod 
\"community-operators-shf7t\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.439000 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r79tp\" (UniqueName: \"kubernetes.io/projected/c599c0c8-d331-4499-be56-07e579b6bf2a-kube-api-access-r79tp\") pod \"community-operators-shf7t\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.439132 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-catalog-content\") pod \"community-operators-shf7t\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.439357 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-utilities\") pod \"community-operators-shf7t\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.439868 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-utilities\") pod \"community-operators-shf7t\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.439954 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-catalog-content\") pod \"community-operators-shf7t\" (UID: 
\"c599c0c8-d331-4499-be56-07e579b6bf2a\") " pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.459017 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r79tp\" (UniqueName: \"kubernetes.io/projected/c599c0c8-d331-4499-be56-07e579b6bf2a-kube-api-access-r79tp\") pod \"community-operators-shf7t\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:18 crc kubenswrapper[4713]: I0314 06:17:18.563886 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:19 crc kubenswrapper[4713]: I0314 06:17:19.151839 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shf7t"] Mar 14 06:17:20 crc kubenswrapper[4713]: I0314 06:17:20.015657 4713 generic.go:334] "Generic (PLEG): container finished" podID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerID="c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b" exitCode=0 Mar 14 06:17:20 crc kubenswrapper[4713]: I0314 06:17:20.015733 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shf7t" event={"ID":"c599c0c8-d331-4499-be56-07e579b6bf2a","Type":"ContainerDied","Data":"c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b"} Mar 14 06:17:20 crc kubenswrapper[4713]: I0314 06:17:20.016198 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shf7t" event={"ID":"c599c0c8-d331-4499-be56-07e579b6bf2a","Type":"ContainerStarted","Data":"c90386682554065379eccfa7ceb3b4d9add86faebec406a09ed3222c1a1c1a09"} Mar 14 06:17:22 crc kubenswrapper[4713]: I0314 06:17:22.041256 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shf7t" 
event={"ID":"c599c0c8-d331-4499-be56-07e579b6bf2a","Type":"ContainerStarted","Data":"359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8"} Mar 14 06:17:24 crc kubenswrapper[4713]: I0314 06:17:24.063420 4713 generic.go:334] "Generic (PLEG): container finished" podID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerID="359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8" exitCode=0 Mar 14 06:17:24 crc kubenswrapper[4713]: I0314 06:17:24.063510 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shf7t" event={"ID":"c599c0c8-d331-4499-be56-07e579b6bf2a","Type":"ContainerDied","Data":"359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8"} Mar 14 06:17:25 crc kubenswrapper[4713]: I0314 06:17:25.080091 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shf7t" event={"ID":"c599c0c8-d331-4499-be56-07e579b6bf2a","Type":"ContainerStarted","Data":"85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3"} Mar 14 06:17:28 crc kubenswrapper[4713]: I0314 06:17:28.564260 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:28 crc kubenswrapper[4713]: I0314 06:17:28.565183 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:28 crc kubenswrapper[4713]: I0314 06:17:28.639073 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:28 crc kubenswrapper[4713]: I0314 06:17:28.665490 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-shf7t" podStartSLOduration=6.050047634 podStartE2EDuration="10.665470429s" podCreationTimestamp="2026-03-14 06:17:18 +0000 UTC" firstStartedPulling="2026-03-14 06:17:20.017823331 +0000 UTC 
m=+3023.105732631" lastFinishedPulling="2026-03-14 06:17:24.633246116 +0000 UTC m=+3027.721155426" observedRunningTime="2026-03-14 06:17:25.121644505 +0000 UTC m=+3028.209553805" watchObservedRunningTime="2026-03-14 06:17:28.665470429 +0000 UTC m=+3031.753379729" Mar 14 06:17:29 crc kubenswrapper[4713]: I0314 06:17:29.173196 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:29 crc kubenswrapper[4713]: I0314 06:17:29.246614 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shf7t"] Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.146270 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-shf7t" podUID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerName="registry-server" containerID="cri-o://85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3" gracePeriod=2 Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.730741 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.802622 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-catalog-content\") pod \"c599c0c8-d331-4499-be56-07e579b6bf2a\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.802756 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-utilities\") pod \"c599c0c8-d331-4499-be56-07e579b6bf2a\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.802888 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r79tp\" (UniqueName: \"kubernetes.io/projected/c599c0c8-d331-4499-be56-07e579b6bf2a-kube-api-access-r79tp\") pod \"c599c0c8-d331-4499-be56-07e579b6bf2a\" (UID: \"c599c0c8-d331-4499-be56-07e579b6bf2a\") " Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.803505 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-utilities" (OuterVolumeSpecName: "utilities") pod "c599c0c8-d331-4499-be56-07e579b6bf2a" (UID: "c599c0c8-d331-4499-be56-07e579b6bf2a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.804407 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.812617 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c599c0c8-d331-4499-be56-07e579b6bf2a-kube-api-access-r79tp" (OuterVolumeSpecName: "kube-api-access-r79tp") pod "c599c0c8-d331-4499-be56-07e579b6bf2a" (UID: "c599c0c8-d331-4499-be56-07e579b6bf2a"). InnerVolumeSpecName "kube-api-access-r79tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.856986 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c599c0c8-d331-4499-be56-07e579b6bf2a" (UID: "c599c0c8-d331-4499-be56-07e579b6bf2a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.906566 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r79tp\" (UniqueName: \"kubernetes.io/projected/c599c0c8-d331-4499-be56-07e579b6bf2a-kube-api-access-r79tp\") on node \"crc\" DevicePath \"\"" Mar 14 06:17:31 crc kubenswrapper[4713]: I0314 06:17:31.906610 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c599c0c8-d331-4499-be56-07e579b6bf2a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.161478 4713 generic.go:334] "Generic (PLEG): container finished" podID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerID="85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3" exitCode=0 Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.161535 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shf7t" event={"ID":"c599c0c8-d331-4499-be56-07e579b6bf2a","Type":"ContainerDied","Data":"85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3"} Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.161569 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shf7t" event={"ID":"c599c0c8-d331-4499-be56-07e579b6bf2a","Type":"ContainerDied","Data":"c90386682554065379eccfa7ceb3b4d9add86faebec406a09ed3222c1a1c1a09"} Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.161579 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-shf7t" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.161606 4713 scope.go:117] "RemoveContainer" containerID="85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.189159 4713 scope.go:117] "RemoveContainer" containerID="359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.219269 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shf7t"] Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.234302 4713 scope.go:117] "RemoveContainer" containerID="c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.235441 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-shf7t"] Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.286615 4713 scope.go:117] "RemoveContainer" containerID="85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3" Mar 14 06:17:32 crc kubenswrapper[4713]: E0314 06:17:32.287056 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3\": container with ID starting with 85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3 not found: ID does not exist" containerID="85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.287092 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3"} err="failed to get container status \"85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3\": rpc error: code = NotFound desc = could not find 
container \"85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3\": container with ID starting with 85f36efdfd837370853c92ff01d0f2257644983cee273e1ff4c70931b9351ab3 not found: ID does not exist" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.287120 4713 scope.go:117] "RemoveContainer" containerID="359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8" Mar 14 06:17:32 crc kubenswrapper[4713]: E0314 06:17:32.287491 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8\": container with ID starting with 359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8 not found: ID does not exist" containerID="359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.287535 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8"} err="failed to get container status \"359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8\": rpc error: code = NotFound desc = could not find container \"359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8\": container with ID starting with 359a1c189cad621166078fc5cdfd451551f3876b936c8278160cd304ee192cc8 not found: ID does not exist" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.287564 4713 scope.go:117] "RemoveContainer" containerID="c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b" Mar 14 06:17:32 crc kubenswrapper[4713]: E0314 06:17:32.287881 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b\": container with ID starting with c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b not found: ID does 
not exist" containerID="c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b" Mar 14 06:17:32 crc kubenswrapper[4713]: I0314 06:17:32.288001 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b"} err="failed to get container status \"c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b\": rpc error: code = NotFound desc = could not find container \"c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b\": container with ID starting with c711bb5f5d0870cb476703531c9d12420df9779e4a2010838050a67bb9698a8b not found: ID does not exist" Mar 14 06:17:33 crc kubenswrapper[4713]: I0314 06:17:33.579880 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c599c0c8-d331-4499-be56-07e579b6bf2a" path="/var/lib/kubelet/pods/c599c0c8-d331-4499-be56-07e579b6bf2a/volumes" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.143319 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557818-c9p75"] Mar 14 06:18:00 crc kubenswrapper[4713]: E0314 06:18:00.144345 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerName="extract-utilities" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.144361 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerName="extract-utilities" Mar 14 06:18:00 crc kubenswrapper[4713]: E0314 06:18:00.144392 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerName="registry-server" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.144398 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerName="registry-server" Mar 14 06:18:00 crc kubenswrapper[4713]: E0314 06:18:00.144411 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerName="extract-content" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.144418 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerName="extract-content" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.144656 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c599c0c8-d331-4499-be56-07e579b6bf2a" containerName="registry-server" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.145553 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557818-c9p75" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.147993 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.148003 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.148019 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.158439 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557818-c9p75"] Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.198228 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc8nf\" (UniqueName: \"kubernetes.io/projected/c5384776-a77c-47d9-b97c-ac482b54cf84-kube-api-access-hc8nf\") pod \"auto-csr-approver-29557818-c9p75\" (UID: \"c5384776-a77c-47d9-b97c-ac482b54cf84\") " pod="openshift-infra/auto-csr-approver-29557818-c9p75" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.299500 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hc8nf\" (UniqueName: \"kubernetes.io/projected/c5384776-a77c-47d9-b97c-ac482b54cf84-kube-api-access-hc8nf\") pod \"auto-csr-approver-29557818-c9p75\" (UID: \"c5384776-a77c-47d9-b97c-ac482b54cf84\") " pod="openshift-infra/auto-csr-approver-29557818-c9p75" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.319046 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc8nf\" (UniqueName: \"kubernetes.io/projected/c5384776-a77c-47d9-b97c-ac482b54cf84-kube-api-access-hc8nf\") pod \"auto-csr-approver-29557818-c9p75\" (UID: \"c5384776-a77c-47d9-b97c-ac482b54cf84\") " pod="openshift-infra/auto-csr-approver-29557818-c9p75" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.466581 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557818-c9p75" Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.976940 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 06:18:00 crc kubenswrapper[4713]: I0314 06:18:00.978639 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557818-c9p75"] Mar 14 06:18:01 crc kubenswrapper[4713]: I0314 06:18:01.536811 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557818-c9p75" event={"ID":"c5384776-a77c-47d9-b97c-ac482b54cf84","Type":"ContainerStarted","Data":"53a09f1c3899272f399e895a39636ecfce4a63d6a072d3b9c15e9fb6918fd6f2"} Mar 14 06:18:02 crc kubenswrapper[4713]: I0314 06:18:02.549347 4713 generic.go:334] "Generic (PLEG): container finished" podID="c5384776-a77c-47d9-b97c-ac482b54cf84" containerID="6bb1b9db7aae4d473df5b59169a640e46a43ca8e781b56f427a1c60fe8e7eeb1" exitCode=0 Mar 14 06:18:02 crc kubenswrapper[4713]: I0314 06:18:02.549415 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557818-c9p75" 
event={"ID":"c5384776-a77c-47d9-b97c-ac482b54cf84","Type":"ContainerDied","Data":"6bb1b9db7aae4d473df5b59169a640e46a43ca8e781b56f427a1c60fe8e7eeb1"} Mar 14 06:18:04 crc kubenswrapper[4713]: I0314 06:18:04.010380 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557818-c9p75" Mar 14 06:18:04 crc kubenswrapper[4713]: I0314 06:18:04.193990 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc8nf\" (UniqueName: \"kubernetes.io/projected/c5384776-a77c-47d9-b97c-ac482b54cf84-kube-api-access-hc8nf\") pod \"c5384776-a77c-47d9-b97c-ac482b54cf84\" (UID: \"c5384776-a77c-47d9-b97c-ac482b54cf84\") " Mar 14 06:18:04 crc kubenswrapper[4713]: I0314 06:18:04.202494 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5384776-a77c-47d9-b97c-ac482b54cf84-kube-api-access-hc8nf" (OuterVolumeSpecName: "kube-api-access-hc8nf") pod "c5384776-a77c-47d9-b97c-ac482b54cf84" (UID: "c5384776-a77c-47d9-b97c-ac482b54cf84"). InnerVolumeSpecName "kube-api-access-hc8nf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:18:04 crc kubenswrapper[4713]: I0314 06:18:04.297409 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc8nf\" (UniqueName: \"kubernetes.io/projected/c5384776-a77c-47d9-b97c-ac482b54cf84-kube-api-access-hc8nf\") on node \"crc\" DevicePath \"\"" Mar 14 06:18:04 crc kubenswrapper[4713]: I0314 06:18:04.580407 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557818-c9p75" event={"ID":"c5384776-a77c-47d9-b97c-ac482b54cf84","Type":"ContainerDied","Data":"53a09f1c3899272f399e895a39636ecfce4a63d6a072d3b9c15e9fb6918fd6f2"} Mar 14 06:18:04 crc kubenswrapper[4713]: I0314 06:18:04.580456 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a09f1c3899272f399e895a39636ecfce4a63d6a072d3b9c15e9fb6918fd6f2" Mar 14 06:18:04 crc kubenswrapper[4713]: I0314 06:18:04.580546 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557818-c9p75" Mar 14 06:18:05 crc kubenswrapper[4713]: I0314 06:18:05.107758 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557812-2bxlg"] Mar 14 06:18:05 crc kubenswrapper[4713]: I0314 06:18:05.117745 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557812-2bxlg"] Mar 14 06:18:05 crc kubenswrapper[4713]: I0314 06:18:05.580842 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a261a9d-d943-4ef6-a98d-5674ac862f1b" path="/var/lib/kubelet/pods/7a261a9d-d943-4ef6-a98d-5674ac862f1b/volumes" Mar 14 06:18:13 crc kubenswrapper[4713]: I0314 06:18:13.543304 4713 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:18:13 crc kubenswrapper[4713]: I0314 06:18:13.543978 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f04ba68c-50bf-406f-977f-7cf9b7d1f4b4" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:18:13 crc kubenswrapper[4713]: I0314 06:18:13.820926 4713 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-hqw7p container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.17:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:18:13 crc kubenswrapper[4713]: I0314 06:18:13.821004 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podUID="f9178880-ef43-43c5-8e91-f4c46d4aa0c6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.17:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:18:40 crc kubenswrapper[4713]: I0314 06:18:40.732032 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:18:40 crc kubenswrapper[4713]: I0314 06:18:40.732576 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Mar 14 06:18:51 crc kubenswrapper[4713]: I0314 06:18:51.150387 4713 scope.go:117] "RemoveContainer" containerID="fb49f780a07db2887ed615d84ecde34d3e2fc213ccd783a57681949b70638d09" Mar 14 06:19:10 crc kubenswrapper[4713]: I0314 06:19:10.731734 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:19:10 crc kubenswrapper[4713]: I0314 06:19:10.732374 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:19:12 crc kubenswrapper[4713]: I0314 06:19:12.785358 4713 generic.go:334] "Generic (PLEG): container finished" podID="b1d72cf9-f971-476d-a917-bb56b1280ac0" containerID="0af38c909c52f7fc78e6c32eb77a191ba04e0dc7cb915095b9c5f54267e15441" exitCode=0 Mar 14 06:19:12 crc kubenswrapper[4713]: I0314 06:19:12.785492 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" event={"ID":"b1d72cf9-f971-476d-a917-bb56b1280ac0","Type":"ContainerDied","Data":"0af38c909c52f7fc78e6c32eb77a191ba04e0dc7cb915095b9c5f54267e15441"} Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.295498 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.462582 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-0\") pod \"b1d72cf9-f971-476d-a917-bb56b1280ac0\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.462804 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ssh-key-openstack-edpm-ipam\") pod \"b1d72cf9-f971-476d-a917-bb56b1280ac0\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.462855 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcgnd\" (UniqueName: \"kubernetes.io/projected/b1d72cf9-f971-476d-a917-bb56b1280ac0-kube-api-access-mcgnd\") pod \"b1d72cf9-f971-476d-a917-bb56b1280ac0\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.462880 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-2\") pod \"b1d72cf9-f971-476d-a917-bb56b1280ac0\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.462944 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-telemetry-combined-ca-bundle\") pod \"b1d72cf9-f971-476d-a917-bb56b1280ac0\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " Mar 14 
06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.462969 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-inventory\") pod \"b1d72cf9-f971-476d-a917-bb56b1280ac0\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.463466 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-1\") pod \"b1d72cf9-f971-476d-a917-bb56b1280ac0\" (UID: \"b1d72cf9-f971-476d-a917-bb56b1280ac0\") " Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.476349 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d72cf9-f971-476d-a917-bb56b1280ac0-kube-api-access-mcgnd" (OuterVolumeSpecName: "kube-api-access-mcgnd") pod "b1d72cf9-f971-476d-a917-bb56b1280ac0" (UID: "b1d72cf9-f971-476d-a917-bb56b1280ac0"). InnerVolumeSpecName "kube-api-access-mcgnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.481421 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b1d72cf9-f971-476d-a917-bb56b1280ac0" (UID: "b1d72cf9-f971-476d-a917-bb56b1280ac0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.499331 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b1d72cf9-f971-476d-a917-bb56b1280ac0" (UID: "b1d72cf9-f971-476d-a917-bb56b1280ac0"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.501615 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b1d72cf9-f971-476d-a917-bb56b1280ac0" (UID: "b1d72cf9-f971-476d-a917-bb56b1280ac0"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.501642 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-inventory" (OuterVolumeSpecName: "inventory") pod "b1d72cf9-f971-476d-a917-bb56b1280ac0" (UID: "b1d72cf9-f971-476d-a917-bb56b1280ac0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.502023 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b1d72cf9-f971-476d-a917-bb56b1280ac0" (UID: "b1d72cf9-f971-476d-a917-bb56b1280ac0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.526434 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b1d72cf9-f971-476d-a917-bb56b1280ac0" (UID: "b1d72cf9-f971-476d-a917-bb56b1280ac0"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.565929 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.565964 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.565977 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcgnd\" (UniqueName: \"kubernetes.io/projected/b1d72cf9-f971-476d-a917-bb56b1280ac0-kube-api-access-mcgnd\") on node \"crc\" DevicePath \"\"" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.565985 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.565996 4713 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-telemetry-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.566005 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.566014 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1d72cf9-f971-476d-a917-bb56b1280ac0-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.806493 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" event={"ID":"b1d72cf9-f971-476d-a917-bb56b1280ac0","Type":"ContainerDied","Data":"980b3d9cff835f07a05c050d9f96adfd704f053ff00de130c15561082e320ea1"} Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.806538 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-przw2" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.806546 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="980b3d9cff835f07a05c050d9f96adfd704f053ff00de130c15561082e320ea1" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.916076 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh"] Mar 14 06:19:14 crc kubenswrapper[4713]: E0314 06:19:14.916609 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d72cf9-f971-476d-a917-bb56b1280ac0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.916625 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d72cf9-f971-476d-a917-bb56b1280ac0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 14 06:19:14 crc kubenswrapper[4713]: E0314 06:19:14.916641 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5384776-a77c-47d9-b97c-ac482b54cf84" containerName="oc" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.916648 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5384776-a77c-47d9-b97c-ac482b54cf84" containerName="oc" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.916928 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5384776-a77c-47d9-b97c-ac482b54cf84" containerName="oc" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.916948 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d72cf9-f971-476d-a917-bb56b1280ac0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.918090 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.922528 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.922960 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.923469 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.923508 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.923750 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 14 06:19:14 crc kubenswrapper[4713]: I0314 06:19:14.936018 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh"] Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.079002 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.079456 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-inventory\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.079597 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.079720 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.079798 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.079889 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.080023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2fj8\" (UniqueName: \"kubernetes.io/projected/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-kube-api-access-l2fj8\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.182270 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2fj8\" (UniqueName: \"kubernetes.io/projected/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-kube-api-access-l2fj8\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.182401 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.182554 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.182584 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.182611 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.182628 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.182647 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.187988 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.188030 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.188299 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.188360 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.188581 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.193731 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.200913 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2fj8\" (UniqueName: \"kubernetes.io/projected/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-kube-api-access-l2fj8\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.253731 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:19:15 crc kubenswrapper[4713]: I0314 06:19:15.937640 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh"] Mar 14 06:19:16 crc kubenswrapper[4713]: I0314 06:19:16.828424 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" event={"ID":"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95","Type":"ContainerStarted","Data":"9676c02c035bf86a8211504fa634b0f7d9bb7d276357e34e8a38005e62a986ec"} Mar 14 06:19:17 crc kubenswrapper[4713]: I0314 06:19:17.841721 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" event={"ID":"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95","Type":"ContainerStarted","Data":"6eb2d45199aae16b9cd45e407777c5257c03a7aa45e96cd72fa10655083d01fe"} Mar 14 06:19:17 crc kubenswrapper[4713]: I0314 06:19:17.882722 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" podStartSLOduration=3.189153658 podStartE2EDuration="3.882687184s" podCreationTimestamp="2026-03-14 06:19:14 +0000 UTC" firstStartedPulling="2026-03-14 06:19:15.940300108 +0000 UTC m=+3139.028209408" lastFinishedPulling="2026-03-14 06:19:16.633833634 +0000 UTC m=+3139.721742934" observedRunningTime="2026-03-14 06:19:17.865981706 +0000 UTC m=+3140.953891006" watchObservedRunningTime="2026-03-14 06:19:17.882687184 +0000 UTC m=+3140.970596494" Mar 14 06:19:40 crc kubenswrapper[4713]: I0314 06:19:40.731578 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 14 06:19:40 crc kubenswrapper[4713]: I0314 06:19:40.732102 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:19:40 crc kubenswrapper[4713]: I0314 06:19:40.732149 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 06:19:40 crc kubenswrapper[4713]: I0314 06:19:40.733274 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 06:19:40 crc kubenswrapper[4713]: I0314 06:19:40.733385 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" gracePeriod=600 Mar 14 06:19:40 crc kubenswrapper[4713]: E0314 06:19:40.878755 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:19:41 crc kubenswrapper[4713]: I0314 
06:19:41.096708 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" exitCode=0 Mar 14 06:19:41 crc kubenswrapper[4713]: I0314 06:19:41.096753 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"} Mar 14 06:19:41 crc kubenswrapper[4713]: I0314 06:19:41.096788 4713 scope.go:117] "RemoveContainer" containerID="bc1c3f2721075751b0437b59794102fba21b72ff7f17b62a70d817f1d5adaee7" Mar 14 06:19:41 crc kubenswrapper[4713]: I0314 06:19:41.097515 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:19:41 crc kubenswrapper[4713]: E0314 06:19:41.097822 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:19:52 crc kubenswrapper[4713]: I0314 06:19:52.563892 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:19:52 crc kubenswrapper[4713]: E0314 06:19:52.564779 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.146665 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557820-lqtpb"] Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.149937 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557820-lqtpb" Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.164470 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557820-lqtpb"] Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.191966 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.192608 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.192806 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.262129 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk75d\" (UniqueName: \"kubernetes.io/projected/ca598d50-9940-480e-b114-c4df47bd2bc5-kube-api-access-kk75d\") pod \"auto-csr-approver-29557820-lqtpb\" (UID: \"ca598d50-9940-480e-b114-c4df47bd2bc5\") " pod="openshift-infra/auto-csr-approver-29557820-lqtpb" Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.364353 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk75d\" (UniqueName: \"kubernetes.io/projected/ca598d50-9940-480e-b114-c4df47bd2bc5-kube-api-access-kk75d\") pod \"auto-csr-approver-29557820-lqtpb\" (UID: \"ca598d50-9940-480e-b114-c4df47bd2bc5\") " 
pod="openshift-infra/auto-csr-approver-29557820-lqtpb" Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.397820 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk75d\" (UniqueName: \"kubernetes.io/projected/ca598d50-9940-480e-b114-c4df47bd2bc5-kube-api-access-kk75d\") pod \"auto-csr-approver-29557820-lqtpb\" (UID: \"ca598d50-9940-480e-b114-c4df47bd2bc5\") " pod="openshift-infra/auto-csr-approver-29557820-lqtpb" Mar 14 06:20:00 crc kubenswrapper[4713]: I0314 06:20:00.510127 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557820-lqtpb" Mar 14 06:20:01 crc kubenswrapper[4713]: I0314 06:20:01.024790 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557820-lqtpb"] Mar 14 06:20:01 crc kubenswrapper[4713]: I0314 06:20:01.985118 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557820-lqtpb" event={"ID":"ca598d50-9940-480e-b114-c4df47bd2bc5","Type":"ContainerStarted","Data":"1dbe8347fce4ad15cad9e0349f70113a726c17ceaa74bfc45917dde7590eda3b"} Mar 14 06:20:02 crc kubenswrapper[4713]: I0314 06:20:02.996880 4713 generic.go:334] "Generic (PLEG): container finished" podID="ca598d50-9940-480e-b114-c4df47bd2bc5" containerID="ccaf1261aeb7b6543c4a9dec5b2f8231a1663039f4204023b77c6c7f826ea488" exitCode=0 Mar 14 06:20:02 crc kubenswrapper[4713]: I0314 06:20:02.996958 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557820-lqtpb" event={"ID":"ca598d50-9940-480e-b114-c4df47bd2bc5","Type":"ContainerDied","Data":"ccaf1261aeb7b6543c4a9dec5b2f8231a1663039f4204023b77c6c7f826ea488"} Mar 14 06:20:04 crc kubenswrapper[4713]: I0314 06:20:04.397517 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557820-lqtpb" Mar 14 06:20:04 crc kubenswrapper[4713]: I0314 06:20:04.474566 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk75d\" (UniqueName: \"kubernetes.io/projected/ca598d50-9940-480e-b114-c4df47bd2bc5-kube-api-access-kk75d\") pod \"ca598d50-9940-480e-b114-c4df47bd2bc5\" (UID: \"ca598d50-9940-480e-b114-c4df47bd2bc5\") " Mar 14 06:20:04 crc kubenswrapper[4713]: I0314 06:20:04.484587 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca598d50-9940-480e-b114-c4df47bd2bc5-kube-api-access-kk75d" (OuterVolumeSpecName: "kube-api-access-kk75d") pod "ca598d50-9940-480e-b114-c4df47bd2bc5" (UID: "ca598d50-9940-480e-b114-c4df47bd2bc5"). InnerVolumeSpecName "kube-api-access-kk75d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:20:04 crc kubenswrapper[4713]: I0314 06:20:04.567197 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:20:04 crc kubenswrapper[4713]: E0314 06:20:04.568057 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:20:04 crc kubenswrapper[4713]: I0314 06:20:04.580589 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk75d\" (UniqueName: \"kubernetes.io/projected/ca598d50-9940-480e-b114-c4df47bd2bc5-kube-api-access-kk75d\") on node \"crc\" DevicePath \"\"" Mar 14 06:20:05 crc kubenswrapper[4713]: I0314 06:20:05.019349 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557820-lqtpb" event={"ID":"ca598d50-9940-480e-b114-c4df47bd2bc5","Type":"ContainerDied","Data":"1dbe8347fce4ad15cad9e0349f70113a726c17ceaa74bfc45917dde7590eda3b"} Mar 14 06:20:05 crc kubenswrapper[4713]: I0314 06:20:05.019407 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dbe8347fce4ad15cad9e0349f70113a726c17ceaa74bfc45917dde7590eda3b" Mar 14 06:20:05 crc kubenswrapper[4713]: I0314 06:20:05.019448 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557820-lqtpb" Mar 14 06:20:05 crc kubenswrapper[4713]: I0314 06:20:05.480039 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557814-kjzb5"] Mar 14 06:20:05 crc kubenswrapper[4713]: I0314 06:20:05.495135 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557814-kjzb5"] Mar 14 06:20:05 crc kubenswrapper[4713]: I0314 06:20:05.578599 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93afc687-91a0-4ddb-8f39-9ce3099939e0" path="/var/lib/kubelet/pods/93afc687-91a0-4ddb-8f39-9ce3099939e0/volumes" Mar 14 06:20:17 crc kubenswrapper[4713]: I0314 06:20:17.742382 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:20:17 crc kubenswrapper[4713]: E0314 06:20:17.743232 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:20:31 crc kubenswrapper[4713]: I0314 06:20:31.567837 4713 scope.go:117] "RemoveContainer" 
containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:20:31 crc kubenswrapper[4713]: E0314 06:20:31.568779 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:20:42 crc kubenswrapper[4713]: I0314 06:20:42.564104 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:20:42 crc kubenswrapper[4713]: E0314 06:20:42.566879 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:20:51 crc kubenswrapper[4713]: I0314 06:20:51.488648 4713 scope.go:117] "RemoveContainer" containerID="3ac76fdc72804ef071b957dded3f05ae659d5e0871949d310baf190732c7e4f1" Mar 14 06:20:54 crc kubenswrapper[4713]: I0314 06:20:54.563751 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:20:54 crc kubenswrapper[4713]: E0314 06:20:54.564571 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:21:06 crc kubenswrapper[4713]: I0314 06:21:06.563725 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:21:06 crc kubenswrapper[4713]: E0314 06:21:06.564621 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:21:19 crc kubenswrapper[4713]: I0314 06:21:19.563963 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:21:19 crc kubenswrapper[4713]: E0314 06:21:19.567395 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:21:22 crc kubenswrapper[4713]: I0314 06:21:22.991709 4713 generic.go:334] "Generic (PLEG): container finished" podID="f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" containerID="6eb2d45199aae16b9cd45e407777c5257c03a7aa45e96cd72fa10655083d01fe" exitCode=0 Mar 14 06:21:22 crc kubenswrapper[4713]: I0314 06:21:22.991921 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" 
event={"ID":"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95","Type":"ContainerDied","Data":"6eb2d45199aae16b9cd45e407777c5257c03a7aa45e96cd72fa10655083d01fe"} Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.472407 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.596511 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-inventory\") pod \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.596574 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2fj8\" (UniqueName: \"kubernetes.io/projected/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-kube-api-access-l2fj8\") pod \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.596611 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ssh-key-openstack-edpm-ipam\") pod \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.596699 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-telemetry-power-monitoring-combined-ca-bundle\") pod \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.596769 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-0\") pod \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.596828 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-2\") pod \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.596872 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-1\") pod \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\" (UID: \"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95\") " Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.602960 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-kube-api-access-l2fj8" (OuterVolumeSpecName: "kube-api-access-l2fj8") pod "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" (UID: "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95"). InnerVolumeSpecName "kube-api-access-l2fj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.606749 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" (UID: "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.628867 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" (UID: "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.630542 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-inventory" (OuterVolumeSpecName: "inventory") pod "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" (UID: "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.632909 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" (UID: "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.637616 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" (UID: "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.646586 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" (UID: "f7835d4f-7f3b-4b5f-8a3f-b950ec203b95"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.700481 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.700523 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.700533 4713 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.700542 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.700553 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2fj8\" (UniqueName: \"kubernetes.io/projected/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-kube-api-access-l2fj8\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:24 crc kubenswrapper[4713]: 
I0314 06:21:24.700562 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:24 crc kubenswrapper[4713]: I0314 06:21:24.700571 4713 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7835d4f-7f3b-4b5f-8a3f-b950ec203b95-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.015871 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" event={"ID":"f7835d4f-7f3b-4b5f-8a3f-b950ec203b95","Type":"ContainerDied","Data":"9676c02c035bf86a8211504fa634b0f7d9bb7d276357e34e8a38005e62a986ec"} Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.016285 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9676c02c035bf86a8211504fa634b0f7d9bb7d276357e34e8a38005e62a986ec" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.015945 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.135711 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp"] Mar 14 06:21:25 crc kubenswrapper[4713]: E0314 06:21:25.136437 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca598d50-9940-480e-b114-c4df47bd2bc5" containerName="oc" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.136460 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca598d50-9940-480e-b114-c4df47bd2bc5" containerName="oc" Mar 14 06:21:25 crc kubenswrapper[4713]: E0314 06:21:25.136495 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.136504 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.136761 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7835d4f-7f3b-4b5f-8a3f-b950ec203b95" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.136785 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca598d50-9940-480e-b114-c4df47bd2bc5" containerName="oc" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.137858 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.142413 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.142486 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.142982 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.143367 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vrhh6" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.146675 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.154502 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp"] Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.316095 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9z7h\" (UniqueName: \"kubernetes.io/projected/63e52146-8f23-43ce-99dd-91c5c9f5b42d-kube-api-access-l9z7h\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.316178 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: 
\"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.316431 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.316699 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.316774 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.419189 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: 
I0314 06:21:25.419345 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.419411 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.419460 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.419642 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9z7h\" (UniqueName: \"kubernetes.io/projected/63e52146-8f23-43ce-99dd-91c5c9f5b42d-kube-api-access-l9z7h\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.425907 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-inventory\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.426295 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.433877 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.439363 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 06:21:25.443028 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9z7h\" (UniqueName: \"kubernetes.io/projected/63e52146-8f23-43ce-99dd-91c5c9f5b42d-kube-api-access-l9z7h\") pod \"logging-edpm-deployment-openstack-edpm-ipam-9mchp\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:25 crc kubenswrapper[4713]: I0314 
06:21:25.472460 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:26 crc kubenswrapper[4713]: I0314 06:21:26.047814 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp"] Mar 14 06:21:27 crc kubenswrapper[4713]: I0314 06:21:27.037381 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" event={"ID":"63e52146-8f23-43ce-99dd-91c5c9f5b42d","Type":"ContainerStarted","Data":"3331cf025eb15955bb449af7a72cabda4982493691407adb0517d344307f0d49"} Mar 14 06:21:28 crc kubenswrapper[4713]: I0314 06:21:28.075883 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" event={"ID":"63e52146-8f23-43ce-99dd-91c5c9f5b42d","Type":"ContainerStarted","Data":"2b7336d878fe01d2f473284116c6298bf9932d91360f27d6f00e6cc5a1677fea"} Mar 14 06:21:28 crc kubenswrapper[4713]: I0314 06:21:28.098424 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" podStartSLOduration=2.164821458 podStartE2EDuration="3.098404233s" podCreationTimestamp="2026-03-14 06:21:25 +0000 UTC" firstStartedPulling="2026-03-14 06:21:26.04755153 +0000 UTC m=+3269.135460830" lastFinishedPulling="2026-03-14 06:21:26.981134305 +0000 UTC m=+3270.069043605" observedRunningTime="2026-03-14 06:21:28.096781184 +0000 UTC m=+3271.184690484" watchObservedRunningTime="2026-03-14 06:21:28.098404233 +0000 UTC m=+3271.186313533" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.032890 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rkbfc"] Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.035515 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.051232 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkbfc"] Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.139592 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9xd5\" (UniqueName: \"kubernetes.io/projected/0e65f5f3-80c6-4ed1-b19a-f312862faacd-kube-api-access-g9xd5\") pod \"redhat-marketplace-rkbfc\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.139705 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-utilities\") pod \"redhat-marketplace-rkbfc\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.139732 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-catalog-content\") pod \"redhat-marketplace-rkbfc\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.242003 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9xd5\" (UniqueName: \"kubernetes.io/projected/0e65f5f3-80c6-4ed1-b19a-f312862faacd-kube-api-access-g9xd5\") pod \"redhat-marketplace-rkbfc\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.242190 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-utilities\") pod \"redhat-marketplace-rkbfc\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.242242 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-catalog-content\") pod \"redhat-marketplace-rkbfc\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.242735 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-utilities\") pod \"redhat-marketplace-rkbfc\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.242772 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-catalog-content\") pod \"redhat-marketplace-rkbfc\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.263857 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9xd5\" (UniqueName: \"kubernetes.io/projected/0e65f5f3-80c6-4ed1-b19a-f312862faacd-kube-api-access-g9xd5\") pod \"redhat-marketplace-rkbfc\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.366275 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:30 crc kubenswrapper[4713]: I0314 06:21:30.960850 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkbfc"] Mar 14 06:21:31 crc kubenswrapper[4713]: I0314 06:21:31.123638 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkbfc" event={"ID":"0e65f5f3-80c6-4ed1-b19a-f312862faacd","Type":"ContainerStarted","Data":"ea05736478a468492701a824fea558f0cfe53fac264b8ab5b9a36962ab1528ef"} Mar 14 06:21:32 crc kubenswrapper[4713]: I0314 06:21:32.136475 4713 generic.go:334] "Generic (PLEG): container finished" podID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerID="1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69" exitCode=0 Mar 14 06:21:32 crc kubenswrapper[4713]: I0314 06:21:32.136797 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkbfc" event={"ID":"0e65f5f3-80c6-4ed1-b19a-f312862faacd","Type":"ContainerDied","Data":"1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69"} Mar 14 06:21:33 crc kubenswrapper[4713]: I0314 06:21:33.148609 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkbfc" event={"ID":"0e65f5f3-80c6-4ed1-b19a-f312862faacd","Type":"ContainerStarted","Data":"3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525"} Mar 14 06:21:33 crc kubenswrapper[4713]: I0314 06:21:33.563854 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:21:33 crc kubenswrapper[4713]: E0314 06:21:33.564941 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:21:34 crc kubenswrapper[4713]: I0314 06:21:34.160041 4713 generic.go:334] "Generic (PLEG): container finished" podID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerID="3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525" exitCode=0 Mar 14 06:21:34 crc kubenswrapper[4713]: I0314 06:21:34.160089 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkbfc" event={"ID":"0e65f5f3-80c6-4ed1-b19a-f312862faacd","Type":"ContainerDied","Data":"3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525"} Mar 14 06:21:35 crc kubenswrapper[4713]: I0314 06:21:35.171365 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkbfc" event={"ID":"0e65f5f3-80c6-4ed1-b19a-f312862faacd","Type":"ContainerStarted","Data":"880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb"} Mar 14 06:21:35 crc kubenswrapper[4713]: I0314 06:21:35.199398 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rkbfc" podStartSLOduration=3.62149244 podStartE2EDuration="6.199383107s" podCreationTimestamp="2026-03-14 06:21:29 +0000 UTC" firstStartedPulling="2026-03-14 06:21:32.139157076 +0000 UTC m=+3275.227066386" lastFinishedPulling="2026-03-14 06:21:34.717047743 +0000 UTC m=+3277.804957053" observedRunningTime="2026-03-14 06:21:35.197296083 +0000 UTC m=+3278.285205383" watchObservedRunningTime="2026-03-14 06:21:35.199383107 +0000 UTC m=+3278.287292407" Mar 14 06:21:40 crc kubenswrapper[4713]: I0314 06:21:40.368081 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:40 crc kubenswrapper[4713]: I0314 
06:21:40.368390 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:40 crc kubenswrapper[4713]: I0314 06:21:40.423436 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:41 crc kubenswrapper[4713]: I0314 06:21:41.295793 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:41 crc kubenswrapper[4713]: I0314 06:21:41.362976 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkbfc"] Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.262309 4713 generic.go:334] "Generic (PLEG): container finished" podID="63e52146-8f23-43ce-99dd-91c5c9f5b42d" containerID="2b7336d878fe01d2f473284116c6298bf9932d91360f27d6f00e6cc5a1677fea" exitCode=0 Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.262414 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" event={"ID":"63e52146-8f23-43ce-99dd-91c5c9f5b42d","Type":"ContainerDied","Data":"2b7336d878fe01d2f473284116c6298bf9932d91360f27d6f00e6cc5a1677fea"} Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.262954 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rkbfc" podUID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerName="registry-server" containerID="cri-o://880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb" gracePeriod=2 Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.802870 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.906989 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9xd5\" (UniqueName: \"kubernetes.io/projected/0e65f5f3-80c6-4ed1-b19a-f312862faacd-kube-api-access-g9xd5\") pod \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.907510 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-catalog-content\") pod \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.907557 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-utilities\") pod \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\" (UID: \"0e65f5f3-80c6-4ed1-b19a-f312862faacd\") " Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.908371 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-utilities" (OuterVolumeSpecName: "utilities") pod "0e65f5f3-80c6-4ed1-b19a-f312862faacd" (UID: "0e65f5f3-80c6-4ed1-b19a-f312862faacd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.922440 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e65f5f3-80c6-4ed1-b19a-f312862faacd-kube-api-access-g9xd5" (OuterVolumeSpecName: "kube-api-access-g9xd5") pod "0e65f5f3-80c6-4ed1-b19a-f312862faacd" (UID: "0e65f5f3-80c6-4ed1-b19a-f312862faacd"). InnerVolumeSpecName "kube-api-access-g9xd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:21:43 crc kubenswrapper[4713]: I0314 06:21:43.935605 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e65f5f3-80c6-4ed1-b19a-f312862faacd" (UID: "0e65f5f3-80c6-4ed1-b19a-f312862faacd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.011267 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.011317 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e65f5f3-80c6-4ed1-b19a-f312862faacd-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.011332 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9xd5\" (UniqueName: \"kubernetes.io/projected/0e65f5f3-80c6-4ed1-b19a-f312862faacd-kube-api-access-g9xd5\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.274886 4713 generic.go:334] "Generic (PLEG): container finished" podID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerID="880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb" exitCode=0 Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.275121 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkbfc" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.275762 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkbfc" event={"ID":"0e65f5f3-80c6-4ed1-b19a-f312862faacd","Type":"ContainerDied","Data":"880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb"} Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.275810 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkbfc" event={"ID":"0e65f5f3-80c6-4ed1-b19a-f312862faacd","Type":"ContainerDied","Data":"ea05736478a468492701a824fea558f0cfe53fac264b8ab5b9a36962ab1528ef"} Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.275826 4713 scope.go:117] "RemoveContainer" containerID="880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.300033 4713 scope.go:117] "RemoveContainer" containerID="3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.320931 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkbfc"] Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.338323 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkbfc"] Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.347197 4713 scope.go:117] "RemoveContainer" containerID="1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.395054 4713 scope.go:117] "RemoveContainer" containerID="880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb" Mar 14 06:21:44 crc kubenswrapper[4713]: E0314 06:21:44.395871 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb\": container with ID starting with 880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb not found: ID does not exist" containerID="880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.395905 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb"} err="failed to get container status \"880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb\": rpc error: code = NotFound desc = could not find container \"880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb\": container with ID starting with 880687be13a71974b1158c7dd0bfab8e0e9482b98b53b84b2066f59bd87bfbcb not found: ID does not exist" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.395927 4713 scope.go:117] "RemoveContainer" containerID="3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525" Mar 14 06:21:44 crc kubenswrapper[4713]: E0314 06:21:44.396398 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525\": container with ID starting with 3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525 not found: ID does not exist" containerID="3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.396425 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525"} err="failed to get container status \"3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525\": rpc error: code = NotFound desc = could not find container \"3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525\": container with ID 
starting with 3028ec13712661526484819e6a21c1b67267cc3a214d0417601f3f1877b42525 not found: ID does not exist" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.396436 4713 scope.go:117] "RemoveContainer" containerID="1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69" Mar 14 06:21:44 crc kubenswrapper[4713]: E0314 06:21:44.396747 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69\": container with ID starting with 1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69 not found: ID does not exist" containerID="1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.396791 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69"} err="failed to get container status \"1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69\": rpc error: code = NotFound desc = could not find container \"1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69\": container with ID starting with 1249a483b55f05a9b6caf1b145df5317077c4c59d6436a5ce7de51bfaf4bfb69 not found: ID does not exist" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.786947 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.943563 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-inventory\") pod \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.943811 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-0\") pod \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.943983 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9z7h\" (UniqueName: \"kubernetes.io/projected/63e52146-8f23-43ce-99dd-91c5c9f5b42d-kube-api-access-l9z7h\") pod \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.944833 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-1\") pod \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.944902 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-ssh-key-openstack-edpm-ipam\") pod \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\" (UID: \"63e52146-8f23-43ce-99dd-91c5c9f5b42d\") " Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 
06:21:44.954066 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e52146-8f23-43ce-99dd-91c5c9f5b42d-kube-api-access-l9z7h" (OuterVolumeSpecName: "kube-api-access-l9z7h") pod "63e52146-8f23-43ce-99dd-91c5c9f5b42d" (UID: "63e52146-8f23-43ce-99dd-91c5c9f5b42d"). InnerVolumeSpecName "kube-api-access-l9z7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.979427 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-inventory" (OuterVolumeSpecName: "inventory") pod "63e52146-8f23-43ce-99dd-91c5c9f5b42d" (UID: "63e52146-8f23-43ce-99dd-91c5c9f5b42d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.979948 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "63e52146-8f23-43ce-99dd-91c5c9f5b42d" (UID: "63e52146-8f23-43ce-99dd-91c5c9f5b42d"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.983129 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "63e52146-8f23-43ce-99dd-91c5c9f5b42d" (UID: "63e52146-8f23-43ce-99dd-91c5c9f5b42d"). InnerVolumeSpecName "logging-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:44 crc kubenswrapper[4713]: I0314 06:21:44.989673 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "63e52146-8f23-43ce-99dd-91c5c9f5b42d" (UID: "63e52146-8f23-43ce-99dd-91c5c9f5b42d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:21:45 crc kubenswrapper[4713]: I0314 06:21:45.049830 4713 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:45 crc kubenswrapper[4713]: I0314 06:21:45.049881 4713 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:45 crc kubenswrapper[4713]: I0314 06:21:45.049904 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9z7h\" (UniqueName: \"kubernetes.io/projected/63e52146-8f23-43ce-99dd-91c5c9f5b42d-kube-api-access-l9z7h\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:45 crc kubenswrapper[4713]: I0314 06:21:45.049917 4713 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:45 crc kubenswrapper[4713]: I0314 06:21:45.049929 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63e52146-8f23-43ce-99dd-91c5c9f5b42d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 06:21:45 crc kubenswrapper[4713]: I0314 
06:21:45.289642 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" event={"ID":"63e52146-8f23-43ce-99dd-91c5c9f5b42d","Type":"ContainerDied","Data":"3331cf025eb15955bb449af7a72cabda4982493691407adb0517d344307f0d49"} Mar 14 06:21:45 crc kubenswrapper[4713]: I0314 06:21:45.289716 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3331cf025eb15955bb449af7a72cabda4982493691407adb0517d344307f0d49" Mar 14 06:21:45 crc kubenswrapper[4713]: I0314 06:21:45.289717 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-9mchp" Mar 14 06:21:45 crc kubenswrapper[4713]: I0314 06:21:45.577746 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" path="/var/lib/kubelet/pods/0e65f5f3-80c6-4ed1-b19a-f312862faacd/volumes" Mar 14 06:21:46 crc kubenswrapper[4713]: I0314 06:21:46.565797 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:21:46 crc kubenswrapper[4713]: E0314 06:21:46.567321 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.173959 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557822-hgfkp"] Mar 14 06:22:00 crc kubenswrapper[4713]: E0314 06:22:00.175844 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" 
containerName="extract-content" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.175935 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerName="extract-content" Mar 14 06:22:00 crc kubenswrapper[4713]: E0314 06:22:00.176006 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerName="extract-utilities" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.176063 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerName="extract-utilities" Mar 14 06:22:00 crc kubenswrapper[4713]: E0314 06:22:00.176149 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerName="registry-server" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.176228 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerName="registry-server" Mar 14 06:22:00 crc kubenswrapper[4713]: E0314 06:22:00.176314 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e52146-8f23-43ce-99dd-91c5c9f5b42d" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.176490 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e52146-8f23-43ce-99dd-91c5c9f5b42d" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.176879 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e65f5f3-80c6-4ed1-b19a-f312862faacd" containerName="registry-server" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.177001 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e52146-8f23-43ce-99dd-91c5c9f5b42d" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.178171 4713 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557822-hgfkp" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.180963 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.181014 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.180967 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.193656 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557822-hgfkp"] Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.306675 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5mx7\" (UniqueName: \"kubernetes.io/projected/f0e3ff30-59b9-4a11-b23e-8848a31aaac1-kube-api-access-t5mx7\") pod \"auto-csr-approver-29557822-hgfkp\" (UID: \"f0e3ff30-59b9-4a11-b23e-8848a31aaac1\") " pod="openshift-infra/auto-csr-approver-29557822-hgfkp" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.409957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5mx7\" (UniqueName: \"kubernetes.io/projected/f0e3ff30-59b9-4a11-b23e-8848a31aaac1-kube-api-access-t5mx7\") pod \"auto-csr-approver-29557822-hgfkp\" (UID: \"f0e3ff30-59b9-4a11-b23e-8848a31aaac1\") " pod="openshift-infra/auto-csr-approver-29557822-hgfkp" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.432595 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5mx7\" (UniqueName: \"kubernetes.io/projected/f0e3ff30-59b9-4a11-b23e-8848a31aaac1-kube-api-access-t5mx7\") pod \"auto-csr-approver-29557822-hgfkp\" (UID: 
\"f0e3ff30-59b9-4a11-b23e-8848a31aaac1\") " pod="openshift-infra/auto-csr-approver-29557822-hgfkp" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.503133 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557822-hgfkp" Mar 14 06:22:00 crc kubenswrapper[4713]: I0314 06:22:00.564083 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:22:00 crc kubenswrapper[4713]: E0314 06:22:00.564388 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:22:01 crc kubenswrapper[4713]: I0314 06:22:01.019505 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557822-hgfkp"] Mar 14 06:22:01 crc kubenswrapper[4713]: I0314 06:22:01.452231 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557822-hgfkp" event={"ID":"f0e3ff30-59b9-4a11-b23e-8848a31aaac1","Type":"ContainerStarted","Data":"2131cc38f13145efa02efbd57701005287f8964bdbdf6bad604883067d5eed0a"} Mar 14 06:22:03 crc kubenswrapper[4713]: I0314 06:22:03.496424 4713 generic.go:334] "Generic (PLEG): container finished" podID="f0e3ff30-59b9-4a11-b23e-8848a31aaac1" containerID="620b664642bb05289a04ea12e6d18801f6573e82f01963780dd3441dfa84755a" exitCode=0 Mar 14 06:22:03 crc kubenswrapper[4713]: I0314 06:22:03.496558 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557822-hgfkp" 
event={"ID":"f0e3ff30-59b9-4a11-b23e-8848a31aaac1","Type":"ContainerDied","Data":"620b664642bb05289a04ea12e6d18801f6573e82f01963780dd3441dfa84755a"} Mar 14 06:22:04 crc kubenswrapper[4713]: I0314 06:22:04.908418 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557822-hgfkp" Mar 14 06:22:05 crc kubenswrapper[4713]: I0314 06:22:05.030096 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5mx7\" (UniqueName: \"kubernetes.io/projected/f0e3ff30-59b9-4a11-b23e-8848a31aaac1-kube-api-access-t5mx7\") pod \"f0e3ff30-59b9-4a11-b23e-8848a31aaac1\" (UID: \"f0e3ff30-59b9-4a11-b23e-8848a31aaac1\") " Mar 14 06:22:05 crc kubenswrapper[4713]: I0314 06:22:05.036009 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e3ff30-59b9-4a11-b23e-8848a31aaac1-kube-api-access-t5mx7" (OuterVolumeSpecName: "kube-api-access-t5mx7") pod "f0e3ff30-59b9-4a11-b23e-8848a31aaac1" (UID: "f0e3ff30-59b9-4a11-b23e-8848a31aaac1"). InnerVolumeSpecName "kube-api-access-t5mx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:22:05 crc kubenswrapper[4713]: I0314 06:22:05.134276 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5mx7\" (UniqueName: \"kubernetes.io/projected/f0e3ff30-59b9-4a11-b23e-8848a31aaac1-kube-api-access-t5mx7\") on node \"crc\" DevicePath \"\"" Mar 14 06:22:05 crc kubenswrapper[4713]: I0314 06:22:05.519687 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557822-hgfkp" event={"ID":"f0e3ff30-59b9-4a11-b23e-8848a31aaac1","Type":"ContainerDied","Data":"2131cc38f13145efa02efbd57701005287f8964bdbdf6bad604883067d5eed0a"} Mar 14 06:22:05 crc kubenswrapper[4713]: I0314 06:22:05.519970 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2131cc38f13145efa02efbd57701005287f8964bdbdf6bad604883067d5eed0a" Mar 14 06:22:05 crc kubenswrapper[4713]: I0314 06:22:05.519757 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557822-hgfkp" Mar 14 06:22:05 crc kubenswrapper[4713]: I0314 06:22:05.982941 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557816-z8vqm"] Mar 14 06:22:05 crc kubenswrapper[4713]: I0314 06:22:05.993782 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557816-z8vqm"] Mar 14 06:22:07 crc kubenswrapper[4713]: I0314 06:22:07.579405 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e826e6-870e-4d27-afbe-fe1547e39e61" path="/var/lib/kubelet/pods/47e826e6-870e-4d27-afbe-fe1547e39e61/volumes" Mar 14 06:22:14 crc kubenswrapper[4713]: I0314 06:22:14.564141 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:22:14 crc kubenswrapper[4713]: E0314 06:22:14.565095 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:22:28 crc kubenswrapper[4713]: I0314 06:22:28.563586 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:22:28 crc kubenswrapper[4713]: E0314 06:22:28.564380 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:22:42 crc kubenswrapper[4713]: I0314 06:22:42.564475 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:22:42 crc kubenswrapper[4713]: E0314 06:22:42.565980 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:22:51 crc kubenswrapper[4713]: I0314 06:22:51.694461 4713 scope.go:117] "RemoveContainer" containerID="9ae085736bd56fae1c869c5579fc76ee2f37023947d37d167da40faf6384bb2a" Mar 14 06:22:55 crc kubenswrapper[4713]: I0314 06:22:55.564332 4713 scope.go:117] "RemoveContainer" 
containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:22:55 crc kubenswrapper[4713]: E0314 06:22:55.565695 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:23:07 crc kubenswrapper[4713]: I0314 06:23:07.572234 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:23:07 crc kubenswrapper[4713]: E0314 06:23:07.573725 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:23:19 crc kubenswrapper[4713]: I0314 06:23:19.564855 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:23:19 crc kubenswrapper[4713]: E0314 06:23:19.566760 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.634805 4713 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dzprg"]
Mar 14 06:23:21 crc kubenswrapper[4713]: E0314 06:23:21.636086 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e3ff30-59b9-4a11-b23e-8848a31aaac1" containerName="oc"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.636103 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e3ff30-59b9-4a11-b23e-8848a31aaac1" containerName="oc"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.636465 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e3ff30-59b9-4a11-b23e-8848a31aaac1" containerName="oc"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.643467 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.655914 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzprg"]
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.739888 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-catalog-content\") pod \"redhat-operators-dzprg\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") " pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.739947 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxx9m\" (UniqueName: \"kubernetes.io/projected/6ce37417-d7aa-4135-b4e8-c28ca7476613-kube-api-access-fxx9m\") pod \"redhat-operators-dzprg\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") " pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.740023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-utilities\") pod \"redhat-operators-dzprg\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") " pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.842030 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-catalog-content\") pod \"redhat-operators-dzprg\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") " pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.842087 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxx9m\" (UniqueName: \"kubernetes.io/projected/6ce37417-d7aa-4135-b4e8-c28ca7476613-kube-api-access-fxx9m\") pod \"redhat-operators-dzprg\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") " pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.842227 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-utilities\") pod \"redhat-operators-dzprg\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") " pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.842637 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-catalog-content\") pod \"redhat-operators-dzprg\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") " pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.842718 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-utilities\") pod \"redhat-operators-dzprg\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") " pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.865532 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxx9m\" (UniqueName: \"kubernetes.io/projected/6ce37417-d7aa-4135-b4e8-c28ca7476613-kube-api-access-fxx9m\") pod \"redhat-operators-dzprg\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") " pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:21 crc kubenswrapper[4713]: I0314 06:23:21.979997 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:22 crc kubenswrapper[4713]: I0314 06:23:22.515308 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzprg"]
Mar 14 06:23:22 crc kubenswrapper[4713]: W0314 06:23:22.516307 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce37417_d7aa_4135_b4e8_c28ca7476613.slice/crio-f60cd4d0aee9787a1c1225e8bc76a944c3f014cd0110f34c9afb6b2f95b06b79 WatchSource:0}: Error finding container f60cd4d0aee9787a1c1225e8bc76a944c3f014cd0110f34c9afb6b2f95b06b79: Status 404 returned error can't find the container with id f60cd4d0aee9787a1c1225e8bc76a944c3f014cd0110f34c9afb6b2f95b06b79
Mar 14 06:23:22 crc kubenswrapper[4713]: I0314 06:23:22.620670 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzprg" event={"ID":"6ce37417-d7aa-4135-b4e8-c28ca7476613","Type":"ContainerStarted","Data":"f60cd4d0aee9787a1c1225e8bc76a944c3f014cd0110f34c9afb6b2f95b06b79"}
Mar 14 06:23:23 crc kubenswrapper[4713]: I0314 06:23:23.680060 4713 generic.go:334] "Generic (PLEG): container finished" podID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerID="715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769" exitCode=0
Mar 14 06:23:23 crc kubenswrapper[4713]: I0314 06:23:23.680125 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzprg" event={"ID":"6ce37417-d7aa-4135-b4e8-c28ca7476613","Type":"ContainerDied","Data":"715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769"}
Mar 14 06:23:23 crc kubenswrapper[4713]: I0314 06:23:23.696559 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 06:23:25 crc kubenswrapper[4713]: I0314 06:23:25.703534 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzprg" event={"ID":"6ce37417-d7aa-4135-b4e8-c28ca7476613","Type":"ContainerStarted","Data":"535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac"}
Mar 14 06:23:31 crc kubenswrapper[4713]: I0314 06:23:31.564607 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"
Mar 14 06:23:31 crc kubenswrapper[4713]: E0314 06:23:31.565641 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:23:31 crc kubenswrapper[4713]: I0314 06:23:31.772058 4713 generic.go:334] "Generic (PLEG): container finished" podID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerID="535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac" exitCode=0
Mar 14 06:23:31 crc kubenswrapper[4713]: I0314 06:23:31.772105 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzprg" event={"ID":"6ce37417-d7aa-4135-b4e8-c28ca7476613","Type":"ContainerDied","Data":"535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac"}
Mar 14 06:23:33 crc kubenswrapper[4713]: I0314 06:23:33.794855 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzprg" event={"ID":"6ce37417-d7aa-4135-b4e8-c28ca7476613","Type":"ContainerStarted","Data":"abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47"}
Mar 14 06:23:33 crc kubenswrapper[4713]: I0314 06:23:33.825507 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dzprg" podStartSLOduration=3.943340385 podStartE2EDuration="12.825470785s" podCreationTimestamp="2026-03-14 06:23:21 +0000 UTC" firstStartedPulling="2026-03-14 06:23:23.696337106 +0000 UTC m=+3386.784246406" lastFinishedPulling="2026-03-14 06:23:32.578467506 +0000 UTC m=+3395.666376806" observedRunningTime="2026-03-14 06:23:33.818875438 +0000 UTC m=+3396.906784748" watchObservedRunningTime="2026-03-14 06:23:33.825470785 +0000 UTC m=+3396.913380095"
Mar 14 06:23:41 crc kubenswrapper[4713]: I0314 06:23:41.981264 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:41 crc kubenswrapper[4713]: I0314 06:23:41.981838 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:23:43 crc kubenswrapper[4713]: I0314 06:23:43.034476 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzprg" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:23:43 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:23:43 crc kubenswrapper[4713]: >
Mar 14 06:23:45 crc kubenswrapper[4713]: I0314 06:23:45.564525 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"
Mar 14 06:23:45 crc kubenswrapper[4713]: E0314 06:23:45.565158 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:23:53 crc kubenswrapper[4713]: I0314 06:23:53.047241 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzprg" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:23:53 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:23:53 crc kubenswrapper[4713]: >
Mar 14 06:23:59 crc kubenswrapper[4713]: I0314 06:23:59.564127 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"
Mar 14 06:23:59 crc kubenswrapper[4713]: E0314 06:23:59.564942 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.153387 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557824-m429t"]
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.157397 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557824-m429t"
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.160019 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.160814 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.161003 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.171591 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557824-m429t"]
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.276726 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr8dh\" (UniqueName: \"kubernetes.io/projected/2e5b8637-afeb-47e9-95c4-5d69b6833b45-kube-api-access-lr8dh\") pod \"auto-csr-approver-29557824-m429t\" (UID: \"2e5b8637-afeb-47e9-95c4-5d69b6833b45\") " pod="openshift-infra/auto-csr-approver-29557824-m429t"
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.378552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr8dh\" (UniqueName: \"kubernetes.io/projected/2e5b8637-afeb-47e9-95c4-5d69b6833b45-kube-api-access-lr8dh\") pod \"auto-csr-approver-29557824-m429t\" (UID: \"2e5b8637-afeb-47e9-95c4-5d69b6833b45\") " pod="openshift-infra/auto-csr-approver-29557824-m429t"
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.398270 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr8dh\" (UniqueName: \"kubernetes.io/projected/2e5b8637-afeb-47e9-95c4-5d69b6833b45-kube-api-access-lr8dh\") pod \"auto-csr-approver-29557824-m429t\" (UID: \"2e5b8637-afeb-47e9-95c4-5d69b6833b45\") " pod="openshift-infra/auto-csr-approver-29557824-m429t"
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.481501 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557824-m429t"
Mar 14 06:24:00 crc kubenswrapper[4713]: I0314 06:24:00.956915 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557824-m429t"]
Mar 14 06:24:01 crc kubenswrapper[4713]: I0314 06:24:01.106561 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557824-m429t" event={"ID":"2e5b8637-afeb-47e9-95c4-5d69b6833b45","Type":"ContainerStarted","Data":"99a843c2a7795082df7ceea8189e899e30fa16ae98850e460bb61325b1f5141c"}
Mar 14 06:24:03 crc kubenswrapper[4713]: I0314 06:24:03.036132 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzprg" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:24:03 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:24:03 crc kubenswrapper[4713]: >
Mar 14 06:24:03 crc kubenswrapper[4713]: I0314 06:24:03.130967 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557824-m429t" event={"ID":"2e5b8637-afeb-47e9-95c4-5d69b6833b45","Type":"ContainerStarted","Data":"5281632e9ca61337a6079843fc3681d7575b5026e9ae4878809b4e03fb81a9a0"}
Mar 14 06:24:03 crc kubenswrapper[4713]: I0314 06:24:03.148514 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557824-m429t" podStartSLOduration=1.493686358 podStartE2EDuration="3.148489637s" podCreationTimestamp="2026-03-14 06:24:00 +0000 UTC" firstStartedPulling="2026-03-14 06:24:00.959096947 +0000 UTC m=+3424.047006247" lastFinishedPulling="2026-03-14 06:24:02.613900226 +0000 UTC m=+3425.701809526" observedRunningTime="2026-03-14 06:24:03.146595057 +0000 UTC m=+3426.234504367" watchObservedRunningTime="2026-03-14 06:24:03.148489637 +0000 UTC m=+3426.236398947"
Mar 14 06:24:04 crc kubenswrapper[4713]: I0314 06:24:04.143351 4713 generic.go:334] "Generic (PLEG): container finished" podID="2e5b8637-afeb-47e9-95c4-5d69b6833b45" containerID="5281632e9ca61337a6079843fc3681d7575b5026e9ae4878809b4e03fb81a9a0" exitCode=0
Mar 14 06:24:04 crc kubenswrapper[4713]: I0314 06:24:04.143652 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557824-m429t" event={"ID":"2e5b8637-afeb-47e9-95c4-5d69b6833b45","Type":"ContainerDied","Data":"5281632e9ca61337a6079843fc3681d7575b5026e9ae4878809b4e03fb81a9a0"}
Mar 14 06:24:05 crc kubenswrapper[4713]: I0314 06:24:05.553688 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557824-m429t"
Mar 14 06:24:05 crc kubenswrapper[4713]: I0314 06:24:05.722849 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr8dh\" (UniqueName: \"kubernetes.io/projected/2e5b8637-afeb-47e9-95c4-5d69b6833b45-kube-api-access-lr8dh\") pod \"2e5b8637-afeb-47e9-95c4-5d69b6833b45\" (UID: \"2e5b8637-afeb-47e9-95c4-5d69b6833b45\") "
Mar 14 06:24:05 crc kubenswrapper[4713]: I0314 06:24:05.731366 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5b8637-afeb-47e9-95c4-5d69b6833b45-kube-api-access-lr8dh" (OuterVolumeSpecName: "kube-api-access-lr8dh") pod "2e5b8637-afeb-47e9-95c4-5d69b6833b45" (UID: "2e5b8637-afeb-47e9-95c4-5d69b6833b45"). InnerVolumeSpecName "kube-api-access-lr8dh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:24:05 crc kubenswrapper[4713]: I0314 06:24:05.826684 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr8dh\" (UniqueName: \"kubernetes.io/projected/2e5b8637-afeb-47e9-95c4-5d69b6833b45-kube-api-access-lr8dh\") on node \"crc\" DevicePath \"\""
Mar 14 06:24:06 crc kubenswrapper[4713]: I0314 06:24:06.169539 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557824-m429t" event={"ID":"2e5b8637-afeb-47e9-95c4-5d69b6833b45","Type":"ContainerDied","Data":"99a843c2a7795082df7ceea8189e899e30fa16ae98850e460bb61325b1f5141c"}
Mar 14 06:24:06 crc kubenswrapper[4713]: I0314 06:24:06.169606 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99a843c2a7795082df7ceea8189e899e30fa16ae98850e460bb61325b1f5141c"
Mar 14 06:24:06 crc kubenswrapper[4713]: I0314 06:24:06.169621 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557824-m429t"
Mar 14 06:24:06 crc kubenswrapper[4713]: I0314 06:24:06.647270 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557818-c9p75"]
Mar 14 06:24:06 crc kubenswrapper[4713]: I0314 06:24:06.657134 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557818-c9p75"]
Mar 14 06:24:07 crc kubenswrapper[4713]: I0314 06:24:07.578269 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5384776-a77c-47d9-b97c-ac482b54cf84" path="/var/lib/kubelet/pods/c5384776-a77c-47d9-b97c-ac482b54cf84/volumes"
Mar 14 06:24:13 crc kubenswrapper[4713]: I0314 06:24:13.130451 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzprg" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:24:13 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:24:13 crc kubenswrapper[4713]: >
Mar 14 06:24:14 crc kubenswrapper[4713]: I0314 06:24:14.563571 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"
Mar 14 06:24:14 crc kubenswrapper[4713]: E0314 06:24:14.564115 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:24:22 crc kubenswrapper[4713]: I0314 06:24:22.036975 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:24:22 crc kubenswrapper[4713]: I0314 06:24:22.095498 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:24:22 crc kubenswrapper[4713]: I0314 06:24:22.282018 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzprg"]
Mar 14 06:24:23 crc kubenswrapper[4713]: I0314 06:24:23.571947 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dzprg" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="registry-server" containerID="cri-o://abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47" gracePeriod=2
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.180793 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.261418 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-catalog-content\") pod \"6ce37417-d7aa-4135-b4e8-c28ca7476613\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") "
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.261827 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-utilities\") pod \"6ce37417-d7aa-4135-b4e8-c28ca7476613\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") "
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.261988 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxx9m\" (UniqueName: \"kubernetes.io/projected/6ce37417-d7aa-4135-b4e8-c28ca7476613-kube-api-access-fxx9m\") pod \"6ce37417-d7aa-4135-b4e8-c28ca7476613\" (UID: \"6ce37417-d7aa-4135-b4e8-c28ca7476613\") "
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.262656 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-utilities" (OuterVolumeSpecName: "utilities") pod "6ce37417-d7aa-4135-b4e8-c28ca7476613" (UID: "6ce37417-d7aa-4135-b4e8-c28ca7476613"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.276872 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce37417-d7aa-4135-b4e8-c28ca7476613-kube-api-access-fxx9m" (OuterVolumeSpecName: "kube-api-access-fxx9m") pod "6ce37417-d7aa-4135-b4e8-c28ca7476613" (UID: "6ce37417-d7aa-4135-b4e8-c28ca7476613"). InnerVolumeSpecName "kube-api-access-fxx9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.365201 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.365243 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxx9m\" (UniqueName: \"kubernetes.io/projected/6ce37417-d7aa-4135-b4e8-c28ca7476613-kube-api-access-fxx9m\") on node \"crc\" DevicePath \"\""
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.409842 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ce37417-d7aa-4135-b4e8-c28ca7476613" (UID: "6ce37417-d7aa-4135-b4e8-c28ca7476613"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.471348 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce37417-d7aa-4135-b4e8-c28ca7476613-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.591338 4713 generic.go:334] "Generic (PLEG): container finished" podID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerID="abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47" exitCode=0
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.591412 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzprg" event={"ID":"6ce37417-d7aa-4135-b4e8-c28ca7476613","Type":"ContainerDied","Data":"abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47"}
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.591459 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzprg" event={"ID":"6ce37417-d7aa-4135-b4e8-c28ca7476613","Type":"ContainerDied","Data":"f60cd4d0aee9787a1c1225e8bc76a944c3f014cd0110f34c9afb6b2f95b06b79"}
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.591488 4713 scope.go:117] "RemoveContainer" containerID="abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.591491 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzprg"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.630936 4713 scope.go:117] "RemoveContainer" containerID="535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.655583 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzprg"]
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.668353 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dzprg"]
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.673008 4713 scope.go:117] "RemoveContainer" containerID="715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.736388 4713 scope.go:117] "RemoveContainer" containerID="abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47"
Mar 14 06:24:24 crc kubenswrapper[4713]: E0314 06:24:24.736998 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47\": container with ID starting with abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47 not found: ID does not exist" containerID="abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.737040 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47"} err="failed to get container status \"abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47\": rpc error: code = NotFound desc = could not find container \"abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47\": container with ID starting with abb7c2379b02a4d252b1cffafd02b272c24eb75cbca774f489ff0d82d6bf8a47 not found: ID does not exist"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.737072 4713 scope.go:117] "RemoveContainer" containerID="535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac"
Mar 14 06:24:24 crc kubenswrapper[4713]: E0314 06:24:24.737592 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac\": container with ID starting with 535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac not found: ID does not exist" containerID="535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.737679 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac"} err="failed to get container status \"535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac\": rpc error: code = NotFound desc = could not find container \"535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac\": container with ID starting with 535b1ef8c575b8305acb7c3c835ecd63e4bf4cc31852e8bc3bab9c57949158ac not found: ID does not exist"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.737746 4713 scope.go:117] "RemoveContainer" containerID="715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769"
Mar 14 06:24:24 crc kubenswrapper[4713]: E0314 06:24:24.738118 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769\": container with ID starting with 715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769 not found: ID does not exist" containerID="715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769"
Mar 14 06:24:24 crc kubenswrapper[4713]: I0314 06:24:24.738154 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769"} err="failed to get container status \"715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769\": rpc error: code = NotFound desc = could not find container \"715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769\": container with ID starting with 715b32e8451033f8d67b13e281410c31e169b1742811941a4409928a6e713769 not found: ID does not exist"
Mar 14 06:24:25 crc kubenswrapper[4713]: I0314 06:24:25.577032 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" path="/var/lib/kubelet/pods/6ce37417-d7aa-4135-b4e8-c28ca7476613/volumes"
Mar 14 06:24:26 crc kubenswrapper[4713]: I0314 06:24:26.564950 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"
Mar 14 06:24:26 crc kubenswrapper[4713]: E0314 06:24:26.565551 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:24:38 crc kubenswrapper[4713]: I0314 06:24:38.563764 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"
Mar 14 06:24:38 crc kubenswrapper[4713]: E0314 06:24:38.564615 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:24:51 crc kubenswrapper[4713]: I0314 06:24:51.563551 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160"
Mar 14 06:24:51 crc kubenswrapper[4713]: I0314 06:24:51.841481 4713 scope.go:117] "RemoveContainer" containerID="6bb1b9db7aae4d473df5b59169a640e46a43ca8e781b56f427a1c60fe8e7eeb1"
Mar 14 06:24:51 crc kubenswrapper[4713]: I0314 06:24:51.995733 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"67da5416042dba6c94a45a0cee4cf85ae09af5e4e6ed64834715198736da7a6c"}
Mar 14 06:25:52 crc kubenswrapper[4713]: I0314 06:25:52.266639 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qkdqn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:25:52 crc kubenswrapper[4713]: I0314 06:25:52.267259 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" podUID="ed4a1500-6481-4d26-a107-f76299623688" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:25:52 crc kubenswrapper[4713]: I0314 06:25:52.266682 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qkdqn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:25:52 crc kubenswrapper[4713]: I0314 06:25:52.267342 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" podUID="ed4a1500-6481-4d26-a107-f76299623688" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:25:52 crc kubenswrapper[4713]: E0314 06:25:52.607608 4713 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.044s"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.147036 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557826-mbqgx"]
Mar 14 06:26:00 crc kubenswrapper[4713]: E0314 06:26:00.148067 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="extract-content"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.148081 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="extract-content"
Mar 14 06:26:00 crc kubenswrapper[4713]: E0314 06:26:00.148121 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="registry-server"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.148128 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="registry-server"
Mar 14 06:26:00 crc kubenswrapper[4713]: E0314 06:26:00.148140 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="extract-utilities"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.148148 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="extract-utilities"
Mar 14 06:26:00 crc kubenswrapper[4713]: E0314 06:26:00.148158 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5b8637-afeb-47e9-95c4-5d69b6833b45" containerName="oc"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.148164 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5b8637-afeb-47e9-95c4-5d69b6833b45" containerName="oc"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.148384 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce37417-d7aa-4135-b4e8-c28ca7476613" containerName="registry-server"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.148409 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5b8637-afeb-47e9-95c4-5d69b6833b45" containerName="oc"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.149232 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557826-mbqgx"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.152025 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.152313 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.152492 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.164036 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557826-mbqgx"]
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.333771 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhnzc\" (UniqueName: \"kubernetes.io/projected/c88a3916-44fd-4e2d-a480-daa6e3c65230-kube-api-access-bhnzc\") pod \"auto-csr-approver-29557826-mbqgx\" (UID: \"c88a3916-44fd-4e2d-a480-daa6e3c65230\") " pod="openshift-infra/auto-csr-approver-29557826-mbqgx"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.436154 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhnzc\" (UniqueName: \"kubernetes.io/projected/c88a3916-44fd-4e2d-a480-daa6e3c65230-kube-api-access-bhnzc\") pod \"auto-csr-approver-29557826-mbqgx\" (UID: \"c88a3916-44fd-4e2d-a480-daa6e3c65230\") " pod="openshift-infra/auto-csr-approver-29557826-mbqgx"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.453808 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhnzc\" (UniqueName: \"kubernetes.io/projected/c88a3916-44fd-4e2d-a480-daa6e3c65230-kube-api-access-bhnzc\") pod \"auto-csr-approver-29557826-mbqgx\" (UID: \"c88a3916-44fd-4e2d-a480-daa6e3c65230\") " pod="openshift-infra/auto-csr-approver-29557826-mbqgx"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.467800 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557826-mbqgx"
Mar 14 06:26:00 crc kubenswrapper[4713]: I0314 06:26:00.954347 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557826-mbqgx"]
Mar 14 06:26:01 crc kubenswrapper[4713]: I0314 06:26:01.003546 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557826-mbqgx" event={"ID":"c88a3916-44fd-4e2d-a480-daa6e3c65230","Type":"ContainerStarted","Data":"c930dce4324e259bd1e1b531c72f6ad13b022c56ff01f28a189e49c196874711"}
Mar 14 06:26:03 crc kubenswrapper[4713]: I0314 06:26:03.035949 4713 generic.go:334] "Generic (PLEG): container finished" podID="c88a3916-44fd-4e2d-a480-daa6e3c65230" containerID="204c9c5165fb3b19b239e26b0df833e3ed2f773f1bf30fa69f1131df1bb19193" exitCode=0
Mar 14 06:26:03 crc kubenswrapper[4713]: I0314 06:26:03.036561 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557826-mbqgx" event={"ID":"c88a3916-44fd-4e2d-a480-daa6e3c65230","Type":"ContainerDied","Data":"204c9c5165fb3b19b239e26b0df833e3ed2f773f1bf30fa69f1131df1bb19193"}
Mar 14 06:26:04 crc kubenswrapper[4713]: I0314 06:26:04.501345 4713 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557826-mbqgx" Mar 14 06:26:04 crc kubenswrapper[4713]: I0314 06:26:04.672493 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhnzc\" (UniqueName: \"kubernetes.io/projected/c88a3916-44fd-4e2d-a480-daa6e3c65230-kube-api-access-bhnzc\") pod \"c88a3916-44fd-4e2d-a480-daa6e3c65230\" (UID: \"c88a3916-44fd-4e2d-a480-daa6e3c65230\") " Mar 14 06:26:04 crc kubenswrapper[4713]: I0314 06:26:04.681771 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c88a3916-44fd-4e2d-a480-daa6e3c65230-kube-api-access-bhnzc" (OuterVolumeSpecName: "kube-api-access-bhnzc") pod "c88a3916-44fd-4e2d-a480-daa6e3c65230" (UID: "c88a3916-44fd-4e2d-a480-daa6e3c65230"). InnerVolumeSpecName "kube-api-access-bhnzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:26:04 crc kubenswrapper[4713]: I0314 06:26:04.776868 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhnzc\" (UniqueName: \"kubernetes.io/projected/c88a3916-44fd-4e2d-a480-daa6e3c65230-kube-api-access-bhnzc\") on node \"crc\" DevicePath \"\"" Mar 14 06:26:05 crc kubenswrapper[4713]: I0314 06:26:05.060953 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557826-mbqgx" event={"ID":"c88a3916-44fd-4e2d-a480-daa6e3c65230","Type":"ContainerDied","Data":"c930dce4324e259bd1e1b531c72f6ad13b022c56ff01f28a189e49c196874711"} Mar 14 06:26:05 crc kubenswrapper[4713]: I0314 06:26:05.061365 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c930dce4324e259bd1e1b531c72f6ad13b022c56ff01f28a189e49c196874711" Mar 14 06:26:05 crc kubenswrapper[4713]: I0314 06:26:05.061012 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557826-mbqgx" Mar 14 06:26:05 crc kubenswrapper[4713]: I0314 06:26:05.580014 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557820-lqtpb"] Mar 14 06:26:05 crc kubenswrapper[4713]: I0314 06:26:05.586925 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557820-lqtpb"] Mar 14 06:26:07 crc kubenswrapper[4713]: I0314 06:26:07.581181 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca598d50-9940-480e-b114-c4df47bd2bc5" path="/var/lib/kubelet/pods/ca598d50-9940-480e-b114-c4df47bd2bc5/volumes" Mar 14 06:26:11 crc kubenswrapper[4713]: I0314 06:26:11.314678 4713 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded" start-of-body= Mar 14 06:26:11 crc kubenswrapper[4713]: I0314 06:26:11.316294 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded" Mar 14 06:26:14 crc kubenswrapper[4713]: I0314 06:26:14.374938 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/opa namespace/openshift-logging: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={ Mar 14 06:26:14 crc kubenswrapper[4713]: "http": "Get \"http://localhost:8082\": context deadline exceeded" Mar 14 06:26:14 crc kubenswrapper[4713]: } Mar 14 06:26:14 crc kubenswrapper[4713]: I0314 06:26:14.375321 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" 
podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="opa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 14 06:26:39 crc kubenswrapper[4713]: I0314 06:26:39.870223 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b" containerName="galera" probeResult="failure" output="command timed out" Mar 14 06:26:52 crc kubenswrapper[4713]: I0314 06:26:52.684572 4713 scope.go:117] "RemoveContainer" containerID="ccaf1261aeb7b6543c4a9dec5b2f8231a1663039f4204023b77c6c7f826ea488" Mar 14 06:27:10 crc kubenswrapper[4713]: I0314 06:27:10.732012 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:27:10 crc kubenswrapper[4713]: I0314 06:27:10.732609 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:27:40 crc kubenswrapper[4713]: I0314 06:27:40.731463 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:27:40 crc kubenswrapper[4713]: I0314 06:27:40.732354 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.473391 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kldkd"] Mar 14 06:27:57 crc kubenswrapper[4713]: E0314 06:27:57.474546 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88a3916-44fd-4e2d-a480-daa6e3c65230" containerName="oc" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.474560 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88a3916-44fd-4e2d-a480-daa6e3c65230" containerName="oc" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.474790 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c88a3916-44fd-4e2d-a480-daa6e3c65230" containerName="oc" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.476835 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.541406 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kldkd"] Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.581116 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmjd\" (UniqueName: \"kubernetes.io/projected/e9da763a-3d66-47b3-a097-715377b1cf86-kube-api-access-7gmjd\") pod \"certified-operators-kldkd\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.581357 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-utilities\") pod \"certified-operators-kldkd\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " 
pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.581417 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-catalog-content\") pod \"certified-operators-kldkd\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.684031 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmjd\" (UniqueName: \"kubernetes.io/projected/e9da763a-3d66-47b3-a097-715377b1cf86-kube-api-access-7gmjd\") pod \"certified-operators-kldkd\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.684087 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-utilities\") pod \"certified-operators-kldkd\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.684112 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-catalog-content\") pod \"certified-operators-kldkd\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.684656 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-catalog-content\") pod \"certified-operators-kldkd\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " 
pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.685458 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-utilities\") pod \"certified-operators-kldkd\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.702229 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmjd\" (UniqueName: \"kubernetes.io/projected/e9da763a-3d66-47b3-a097-715377b1cf86-kube-api-access-7gmjd\") pod \"certified-operators-kldkd\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:57 crc kubenswrapper[4713]: I0314 06:27:57.801990 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:27:58 crc kubenswrapper[4713]: I0314 06:27:58.485023 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kldkd"] Mar 14 06:27:58 crc kubenswrapper[4713]: I0314 06:27:58.546974 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkd" event={"ID":"e9da763a-3d66-47b3-a097-715377b1cf86","Type":"ContainerStarted","Data":"e8717818ae9e0979af53b3fcf976bb1dd0e6f3cb638debc6334bca92fdf54bfc"} Mar 14 06:27:59 crc kubenswrapper[4713]: I0314 06:27:59.560338 4713 generic.go:334] "Generic (PLEG): container finished" podID="e9da763a-3d66-47b3-a097-715377b1cf86" containerID="e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb" exitCode=0 Mar 14 06:27:59 crc kubenswrapper[4713]: I0314 06:27:59.560446 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkd" 
event={"ID":"e9da763a-3d66-47b3-a097-715377b1cf86","Type":"ContainerDied","Data":"e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb"} Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.151105 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557828-2nlzd"] Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.152754 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557828-2nlzd" Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.156288 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.157195 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.157368 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.165624 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557828-2nlzd"] Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.273662 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrxgh\" (UniqueName: \"kubernetes.io/projected/1807a6a6-3794-4fab-bdc7-17c83fa3fdeb-kube-api-access-qrxgh\") pod \"auto-csr-approver-29557828-2nlzd\" (UID: \"1807a6a6-3794-4fab-bdc7-17c83fa3fdeb\") " pod="openshift-infra/auto-csr-approver-29557828-2nlzd" Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.375997 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrxgh\" (UniqueName: \"kubernetes.io/projected/1807a6a6-3794-4fab-bdc7-17c83fa3fdeb-kube-api-access-qrxgh\") pod \"auto-csr-approver-29557828-2nlzd\" (UID: 
\"1807a6a6-3794-4fab-bdc7-17c83fa3fdeb\") " pod="openshift-infra/auto-csr-approver-29557828-2nlzd" Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.412961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrxgh\" (UniqueName: \"kubernetes.io/projected/1807a6a6-3794-4fab-bdc7-17c83fa3fdeb-kube-api-access-qrxgh\") pod \"auto-csr-approver-29557828-2nlzd\" (UID: \"1807a6a6-3794-4fab-bdc7-17c83fa3fdeb\") " pod="openshift-infra/auto-csr-approver-29557828-2nlzd" Mar 14 06:28:00 crc kubenswrapper[4713]: I0314 06:28:00.510788 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557828-2nlzd" Mar 14 06:28:01 crc kubenswrapper[4713]: I0314 06:28:01.583518 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkd" event={"ID":"e9da763a-3d66-47b3-a097-715377b1cf86","Type":"ContainerStarted","Data":"cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b"} Mar 14 06:28:01 crc kubenswrapper[4713]: I0314 06:28:01.621975 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557828-2nlzd"] Mar 14 06:28:01 crc kubenswrapper[4713]: W0314 06:28:01.637514 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1807a6a6_3794_4fab_bdc7_17c83fa3fdeb.slice/crio-5ac3f74ff16990263461f51ef0de0afd9dfd903a671f4416c6af2676b5eca732 WatchSource:0}: Error finding container 5ac3f74ff16990263461f51ef0de0afd9dfd903a671f4416c6af2676b5eca732: Status 404 returned error can't find the container with id 5ac3f74ff16990263461f51ef0de0afd9dfd903a671f4416c6af2676b5eca732 Mar 14 06:28:02 crc kubenswrapper[4713]: I0314 06:28:02.594393 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557828-2nlzd" 
event={"ID":"1807a6a6-3794-4fab-bdc7-17c83fa3fdeb","Type":"ContainerStarted","Data":"5ac3f74ff16990263461f51ef0de0afd9dfd903a671f4416c6af2676b5eca732"} Mar 14 06:28:03 crc kubenswrapper[4713]: I0314 06:28:03.606994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557828-2nlzd" event={"ID":"1807a6a6-3794-4fab-bdc7-17c83fa3fdeb","Type":"ContainerStarted","Data":"eaa021c6f943b25fcdcdc9073daad184cb58b392754fd96ddfef12c6f0a9b43f"} Mar 14 06:28:03 crc kubenswrapper[4713]: I0314 06:28:03.624638 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557828-2nlzd" podStartSLOduration=2.46817487 podStartE2EDuration="3.624618884s" podCreationTimestamp="2026-03-14 06:28:00 +0000 UTC" firstStartedPulling="2026-03-14 06:28:01.644025874 +0000 UTC m=+3664.731935174" lastFinishedPulling="2026-03-14 06:28:02.800469888 +0000 UTC m=+3665.888379188" observedRunningTime="2026-03-14 06:28:03.622415306 +0000 UTC m=+3666.710324616" watchObservedRunningTime="2026-03-14 06:28:03.624618884 +0000 UTC m=+3666.712528174" Mar 14 06:28:04 crc kubenswrapper[4713]: I0314 06:28:04.620944 4713 generic.go:334] "Generic (PLEG): container finished" podID="1807a6a6-3794-4fab-bdc7-17c83fa3fdeb" containerID="eaa021c6f943b25fcdcdc9073daad184cb58b392754fd96ddfef12c6f0a9b43f" exitCode=0 Mar 14 06:28:04 crc kubenswrapper[4713]: I0314 06:28:04.621136 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557828-2nlzd" event={"ID":"1807a6a6-3794-4fab-bdc7-17c83fa3fdeb","Type":"ContainerDied","Data":"eaa021c6f943b25fcdcdc9073daad184cb58b392754fd96ddfef12c6f0a9b43f"} Mar 14 06:28:05 crc kubenswrapper[4713]: I0314 06:28:05.637680 4713 generic.go:334] "Generic (PLEG): container finished" podID="e9da763a-3d66-47b3-a097-715377b1cf86" containerID="cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b" exitCode=0 Mar 14 06:28:05 crc kubenswrapper[4713]: I0314 
06:28:05.637733 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkd" event={"ID":"e9da763a-3d66-47b3-a097-715377b1cf86","Type":"ContainerDied","Data":"cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b"} Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.143659 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557828-2nlzd" Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.292484 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrxgh\" (UniqueName: \"kubernetes.io/projected/1807a6a6-3794-4fab-bdc7-17c83fa3fdeb-kube-api-access-qrxgh\") pod \"1807a6a6-3794-4fab-bdc7-17c83fa3fdeb\" (UID: \"1807a6a6-3794-4fab-bdc7-17c83fa3fdeb\") " Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.307688 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1807a6a6-3794-4fab-bdc7-17c83fa3fdeb-kube-api-access-qrxgh" (OuterVolumeSpecName: "kube-api-access-qrxgh") pod "1807a6a6-3794-4fab-bdc7-17c83fa3fdeb" (UID: "1807a6a6-3794-4fab-bdc7-17c83fa3fdeb"). InnerVolumeSpecName "kube-api-access-qrxgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.395902 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrxgh\" (UniqueName: \"kubernetes.io/projected/1807a6a6-3794-4fab-bdc7-17c83fa3fdeb-kube-api-access-qrxgh\") on node \"crc\" DevicePath \"\"" Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.651041 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkd" event={"ID":"e9da763a-3d66-47b3-a097-715377b1cf86","Type":"ContainerStarted","Data":"12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd"} Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.653358 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557828-2nlzd" event={"ID":"1807a6a6-3794-4fab-bdc7-17c83fa3fdeb","Type":"ContainerDied","Data":"5ac3f74ff16990263461f51ef0de0afd9dfd903a671f4416c6af2676b5eca732"} Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.653380 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ac3f74ff16990263461f51ef0de0afd9dfd903a671f4416c6af2676b5eca732" Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.653603 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557828-2nlzd" Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.694242 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kldkd" podStartSLOduration=3.090702538 podStartE2EDuration="9.694193363s" podCreationTimestamp="2026-03-14 06:27:57 +0000 UTC" firstStartedPulling="2026-03-14 06:27:59.566856646 +0000 UTC m=+3662.654765946" lastFinishedPulling="2026-03-14 06:28:06.170347471 +0000 UTC m=+3669.258256771" observedRunningTime="2026-03-14 06:28:06.683523068 +0000 UTC m=+3669.771432398" watchObservedRunningTime="2026-03-14 06:28:06.694193363 +0000 UTC m=+3669.782102663" Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.740137 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557822-hgfkp"] Mar 14 06:28:06 crc kubenswrapper[4713]: I0314 06:28:06.754650 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557822-hgfkp"] Mar 14 06:28:07 crc kubenswrapper[4713]: I0314 06:28:07.591171 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e3ff30-59b9-4a11-b23e-8848a31aaac1" path="/var/lib/kubelet/pods/f0e3ff30-59b9-4a11-b23e-8848a31aaac1/volumes" Mar 14 06:28:07 crc kubenswrapper[4713]: I0314 06:28:07.802163 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:28:07 crc kubenswrapper[4713]: I0314 06:28:07.802241 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:28:08 crc kubenswrapper[4713]: I0314 06:28:08.869541 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kldkd" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" containerName="registry-server" probeResult="failure" output=< Mar 14 06:28:08 crc 
kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:28:08 crc kubenswrapper[4713]: > Mar 14 06:28:10 crc kubenswrapper[4713]: I0314 06:28:10.731665 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:28:10 crc kubenswrapper[4713]: I0314 06:28:10.732416 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:28:10 crc kubenswrapper[4713]: I0314 06:28:10.732460 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 06:28:10 crc kubenswrapper[4713]: I0314 06:28:10.733029 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67da5416042dba6c94a45a0cee4cf85ae09af5e4e6ed64834715198736da7a6c"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 06:28:10 crc kubenswrapper[4713]: I0314 06:28:10.733075 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://67da5416042dba6c94a45a0cee4cf85ae09af5e4e6ed64834715198736da7a6c" gracePeriod=600 Mar 14 06:28:11 crc kubenswrapper[4713]: I0314 06:28:11.735548 4713 generic.go:334] "Generic (PLEG): 
container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="67da5416042dba6c94a45a0cee4cf85ae09af5e4e6ed64834715198736da7a6c" exitCode=0 Mar 14 06:28:11 crc kubenswrapper[4713]: I0314 06:28:11.735628 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"67da5416042dba6c94a45a0cee4cf85ae09af5e4e6ed64834715198736da7a6c"} Mar 14 06:28:11 crc kubenswrapper[4713]: I0314 06:28:11.735923 4713 scope.go:117] "RemoveContainer" containerID="caee9675b58e4ed6bf8882c3ca66773ef27f363a2d03c287b095e2080a670160" Mar 14 06:28:12 crc kubenswrapper[4713]: I0314 06:28:12.751291 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"} Mar 14 06:28:17 crc kubenswrapper[4713]: I0314 06:28:17.870446 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:28:17 crc kubenswrapper[4713]: I0314 06:28:17.929661 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:28:18 crc kubenswrapper[4713]: I0314 06:28:18.116073 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kldkd"] Mar 14 06:28:19 crc kubenswrapper[4713]: I0314 06:28:19.905323 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kldkd" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" containerName="registry-server" containerID="cri-o://12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd" gracePeriod=2 Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.603105 
4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.683146 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gmjd\" (UniqueName: \"kubernetes.io/projected/e9da763a-3d66-47b3-a097-715377b1cf86-kube-api-access-7gmjd\") pod \"e9da763a-3d66-47b3-a097-715377b1cf86\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.683545 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-utilities\") pod \"e9da763a-3d66-47b3-a097-715377b1cf86\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.683585 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-catalog-content\") pod \"e9da763a-3d66-47b3-a097-715377b1cf86\" (UID: \"e9da763a-3d66-47b3-a097-715377b1cf86\") " Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.686486 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-utilities" (OuterVolumeSpecName: "utilities") pod "e9da763a-3d66-47b3-a097-715377b1cf86" (UID: "e9da763a-3d66-47b3-a097-715377b1cf86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.700328 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9da763a-3d66-47b3-a097-715377b1cf86-kube-api-access-7gmjd" (OuterVolumeSpecName: "kube-api-access-7gmjd") pod "e9da763a-3d66-47b3-a097-715377b1cf86" (UID: "e9da763a-3d66-47b3-a097-715377b1cf86"). 
InnerVolumeSpecName "kube-api-access-7gmjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.787356 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.787400 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gmjd\" (UniqueName: \"kubernetes.io/projected/e9da763a-3d66-47b3-a097-715377b1cf86-kube-api-access-7gmjd\") on node \"crc\" DevicePath \"\"" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.797068 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9da763a-3d66-47b3-a097-715377b1cf86" (UID: "e9da763a-3d66-47b3-a097-715377b1cf86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.891260 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9da763a-3d66-47b3-a097-715377b1cf86-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.920348 4713 generic.go:334] "Generic (PLEG): container finished" podID="e9da763a-3d66-47b3-a097-715377b1cf86" containerID="12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd" exitCode=0 Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.920398 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkd" event={"ID":"e9da763a-3d66-47b3-a097-715377b1cf86","Type":"ContainerDied","Data":"12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd"} Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.920415 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kldkd" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.920439 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkd" event={"ID":"e9da763a-3d66-47b3-a097-715377b1cf86","Type":"ContainerDied","Data":"e8717818ae9e0979af53b3fcf976bb1dd0e6f3cb638debc6334bca92fdf54bfc"} Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.920461 4713 scope.go:117] "RemoveContainer" containerID="12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.955665 4713 scope.go:117] "RemoveContainer" containerID="cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.972854 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kldkd"] Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.984866 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kldkd"] Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:20.985823 4713 scope.go:117] "RemoveContainer" containerID="e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:21.048096 4713 scope.go:117] "RemoveContainer" containerID="12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd" Mar 14 06:28:21 crc kubenswrapper[4713]: E0314 06:28:21.050322 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd\": container with ID starting with 12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd not found: ID does not exist" containerID="12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:21.050348 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd"} err="failed to get container status \"12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd\": rpc error: code = NotFound desc = could not find container \"12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd\": container with ID starting with 12bf09904670f7a2fc9851a4e3ada35ac1b30c38ef3d12d1afc12669ee76cfbd not found: ID does not exist" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:21.050372 4713 scope.go:117] "RemoveContainer" containerID="cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b" Mar 14 06:28:21 crc kubenswrapper[4713]: E0314 06:28:21.050558 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b\": container with ID starting with cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b not found: ID does not exist" containerID="cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:21.050578 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b"} err="failed to get container status \"cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b\": rpc error: code = NotFound desc = could not find container \"cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b\": container with ID starting with cc7cee07c7c5e4fc8654daeb5ec68a8516e106aaf37c5ed8c53ea81440e6216b not found: ID does not exist" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:21.050592 4713 scope.go:117] "RemoveContainer" containerID="e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb" Mar 14 06:28:21 crc kubenswrapper[4713]: E0314 
06:28:21.051092 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb\": container with ID starting with e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb not found: ID does not exist" containerID="e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:21.051107 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb"} err="failed to get container status \"e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb\": rpc error: code = NotFound desc = could not find container \"e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb\": container with ID starting with e27c2290e90e7a613eb999f88f013754eb4e3246a653d65935f146d61647b8eb not found: ID does not exist" Mar 14 06:28:21 crc kubenswrapper[4713]: I0314 06:28:21.577500 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" path="/var/lib/kubelet/pods/e9da763a-3d66-47b3-a097-715377b1cf86/volumes" Mar 14 06:28:29 crc kubenswrapper[4713]: E0314 06:28:29.711644 4713 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.148s" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.726321 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vmvfw"] Mar 14 06:28:33 crc kubenswrapper[4713]: E0314 06:28:33.727704 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" containerName="extract-content" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.727724 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" 
containerName="extract-content" Mar 14 06:28:33 crc kubenswrapper[4713]: E0314 06:28:33.727763 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1807a6a6-3794-4fab-bdc7-17c83fa3fdeb" containerName="oc" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.727776 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1807a6a6-3794-4fab-bdc7-17c83fa3fdeb" containerName="oc" Mar 14 06:28:33 crc kubenswrapper[4713]: E0314 06:28:33.727810 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" containerName="registry-server" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.727821 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" containerName="registry-server" Mar 14 06:28:33 crc kubenswrapper[4713]: E0314 06:28:33.727847 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" containerName="extract-utilities" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.727857 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" containerName="extract-utilities" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.728148 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9da763a-3d66-47b3-a097-715377b1cf86" containerName="registry-server" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.728171 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1807a6a6-3794-4fab-bdc7-17c83fa3fdeb" containerName="oc" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.730172 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.740152 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmvfw"] Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.789090 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e412202e-9dd7-4ebb-90a4-c25cbf3241b8-utilities\") pod \"community-operators-vmvfw\" (UID: \"e412202e-9dd7-4ebb-90a4-c25cbf3241b8\") " pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.789243 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96lkk\" (UniqueName: \"kubernetes.io/projected/e412202e-9dd7-4ebb-90a4-c25cbf3241b8-kube-api-access-96lkk\") pod \"community-operators-vmvfw\" (UID: \"e412202e-9dd7-4ebb-90a4-c25cbf3241b8\") " pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.789300 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e412202e-9dd7-4ebb-90a4-c25cbf3241b8-catalog-content\") pod \"community-operators-vmvfw\" (UID: \"e412202e-9dd7-4ebb-90a4-c25cbf3241b8\") " pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.891300 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96lkk\" (UniqueName: \"kubernetes.io/projected/e412202e-9dd7-4ebb-90a4-c25cbf3241b8-kube-api-access-96lkk\") pod \"community-operators-vmvfw\" (UID: \"e412202e-9dd7-4ebb-90a4-c25cbf3241b8\") " pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.891394 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e412202e-9dd7-4ebb-90a4-c25cbf3241b8-catalog-content\") pod \"community-operators-vmvfw\" (UID: \"e412202e-9dd7-4ebb-90a4-c25cbf3241b8\") " pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.891580 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e412202e-9dd7-4ebb-90a4-c25cbf3241b8-utilities\") pod \"community-operators-vmvfw\" (UID: \"e412202e-9dd7-4ebb-90a4-c25cbf3241b8\") " pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.891930 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e412202e-9dd7-4ebb-90a4-c25cbf3241b8-catalog-content\") pod \"community-operators-vmvfw\" (UID: \"e412202e-9dd7-4ebb-90a4-c25cbf3241b8\") " pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.891996 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e412202e-9dd7-4ebb-90a4-c25cbf3241b8-utilities\") pod \"community-operators-vmvfw\" (UID: \"e412202e-9dd7-4ebb-90a4-c25cbf3241b8\") " pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:33 crc kubenswrapper[4713]: I0314 06:28:33.911186 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96lkk\" (UniqueName: \"kubernetes.io/projected/e412202e-9dd7-4ebb-90a4-c25cbf3241b8-kube-api-access-96lkk\") pod \"community-operators-vmvfw\" (UID: \"e412202e-9dd7-4ebb-90a4-c25cbf3241b8\") " pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:34 crc kubenswrapper[4713]: I0314 06:28:34.061849 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:34 crc kubenswrapper[4713]: I0314 06:28:34.601308 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmvfw"] Mar 14 06:28:34 crc kubenswrapper[4713]: W0314 06:28:34.603802 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode412202e_9dd7_4ebb_90a4_c25cbf3241b8.slice/crio-70f9e6266342c5f967016a931a5380710b2d362852d9bfff7e16aded747b79ed WatchSource:0}: Error finding container 70f9e6266342c5f967016a931a5380710b2d362852d9bfff7e16aded747b79ed: Status 404 returned error can't find the container with id 70f9e6266342c5f967016a931a5380710b2d362852d9bfff7e16aded747b79ed Mar 14 06:28:35 crc kubenswrapper[4713]: I0314 06:28:35.180001 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmvfw" event={"ID":"e412202e-9dd7-4ebb-90a4-c25cbf3241b8","Type":"ContainerStarted","Data":"70f9e6266342c5f967016a931a5380710b2d362852d9bfff7e16aded747b79ed"} Mar 14 06:28:37 crc kubenswrapper[4713]: I0314 06:28:36.190268 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmvfw" event={"ID":"e412202e-9dd7-4ebb-90a4-c25cbf3241b8","Type":"ContainerStarted","Data":"d0cb35c834233ed0f21efdb1f96cfaad45315f5fe661c6f92cfe7a71bb5a0e81"} Mar 14 06:28:37 crc kubenswrapper[4713]: I0314 06:28:37.083741 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:28:37 crc kubenswrapper[4713]: I0314 06:28:37.083794 4713 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:28:38 crc kubenswrapper[4713]: I0314 06:28:38.013525 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": context deadline exceeded" start-of-body= Mar 14 06:28:38 crc kubenswrapper[4713]: I0314 06:28:38.013974 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": context deadline exceeded" Mar 14 06:28:38 crc kubenswrapper[4713]: I0314 06:28:38.045158 4713 patch_prober.go:28] interesting pod/controller-manager-56cb9c466-g7c95 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded" start-of-body= Mar 14 06:28:38 crc kubenswrapper[4713]: I0314 06:28:38.045228 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" podUID="cfebbac0-ce4d-43c1-b872-293d64e8256b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded" Mar 14 06:28:38 crc kubenswrapper[4713]: I0314 06:28:38.058790 4713 patch_prober.go:28] interesting pod/controller-manager-56cb9c466-g7c95 container/controller-manager 
namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:28:38 crc kubenswrapper[4713]: I0314 06:28:38.058852 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" podUID="cfebbac0-ce4d-43c1-b872-293d64e8256b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:28:38 crc kubenswrapper[4713]: I0314 06:28:38.058966 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:28:38 crc kubenswrapper[4713]: I0314 06:28:38.058993 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:28:39 crc kubenswrapper[4713]: I0314 06:28:39.143566 4713 generic.go:334] "Generic (PLEG): container finished" podID="e412202e-9dd7-4ebb-90a4-c25cbf3241b8" containerID="d0cb35c834233ed0f21efdb1f96cfaad45315f5fe661c6f92cfe7a71bb5a0e81" exitCode=0 Mar 14 06:28:39 crc kubenswrapper[4713]: I0314 06:28:39.143633 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmvfw" event={"ID":"e412202e-9dd7-4ebb-90a4-c25cbf3241b8","Type":"ContainerDied","Data":"d0cb35c834233ed0f21efdb1f96cfaad45315f5fe661c6f92cfe7a71bb5a0e81"} Mar 14 06:28:39 crc 
kubenswrapper[4713]: I0314 06:28:39.146855 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 06:28:47 crc kubenswrapper[4713]: I0314 06:28:47.366606 4713 generic.go:334] "Generic (PLEG): container finished" podID="e412202e-9dd7-4ebb-90a4-c25cbf3241b8" containerID="e8f7a724050473872e148e484471c644f9c4fb81ca17bdac7f609549eccb9d3b" exitCode=0 Mar 14 06:28:47 crc kubenswrapper[4713]: I0314 06:28:47.366720 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmvfw" event={"ID":"e412202e-9dd7-4ebb-90a4-c25cbf3241b8","Type":"ContainerDied","Data":"e8f7a724050473872e148e484471c644f9c4fb81ca17bdac7f609549eccb9d3b"} Mar 14 06:28:48 crc kubenswrapper[4713]: I0314 06:28:48.386651 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmvfw" event={"ID":"e412202e-9dd7-4ebb-90a4-c25cbf3241b8","Type":"ContainerStarted","Data":"4b29a79d4edbeba8056fcf97f82c2b374fc178fafea175c494fd432eb73e8ff3"} Mar 14 06:28:48 crc kubenswrapper[4713]: I0314 06:28:48.418676 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vmvfw" podStartSLOduration=6.566290099 podStartE2EDuration="15.418660923s" podCreationTimestamp="2026-03-14 06:28:33 +0000 UTC" firstStartedPulling="2026-03-14 06:28:39.1465497 +0000 UTC m=+3702.234459000" lastFinishedPulling="2026-03-14 06:28:47.998920524 +0000 UTC m=+3711.086829824" observedRunningTime="2026-03-14 06:28:48.411503048 +0000 UTC m=+3711.499412348" watchObservedRunningTime="2026-03-14 06:28:48.418660923 +0000 UTC m=+3711.506570223" Mar 14 06:28:52 crc kubenswrapper[4713]: I0314 06:28:52.807136 4713 scope.go:117] "RemoveContainer" containerID="620b664642bb05289a04ea12e6d18801f6573e82f01963780dd3441dfa84755a" Mar 14 06:28:54 crc kubenswrapper[4713]: I0314 06:28:54.062862 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:54 crc kubenswrapper[4713]: I0314 06:28:54.063286 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:54 crc kubenswrapper[4713]: I0314 06:28:54.130410 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:54 crc kubenswrapper[4713]: I0314 06:28:54.514615 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vmvfw" Mar 14 06:28:54 crc kubenswrapper[4713]: I0314 06:28:54.624937 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmvfw"] Mar 14 06:28:54 crc kubenswrapper[4713]: I0314 06:28:54.677779 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mgcsn"] Mar 14 06:28:54 crc kubenswrapper[4713]: I0314 06:28:54.677999 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mgcsn" podUID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerName="registry-server" containerID="cri-o://a20dd678b38e40c192cb2b14c7558caf95cd04333fa8aa371946fc600ec1f42e" gracePeriod=2 Mar 14 06:28:55 crc kubenswrapper[4713]: I0314 06:28:55.468221 4713 generic.go:334] "Generic (PLEG): container finished" podID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerID="a20dd678b38e40c192cb2b14c7558caf95cd04333fa8aa371946fc600ec1f42e" exitCode=0 Mar 14 06:28:55 crc kubenswrapper[4713]: I0314 06:28:55.468293 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgcsn" event={"ID":"b28e03d1-af1f-4f04-ac10-91fce1fde925","Type":"ContainerDied","Data":"a20dd678b38e40c192cb2b14c7558caf95cd04333fa8aa371946fc600ec1f42e"} Mar 14 06:28:55 crc kubenswrapper[4713]: I0314 06:28:55.891471 4713 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgcsn" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.003289 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-utilities\") pod \"b28e03d1-af1f-4f04-ac10-91fce1fde925\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.003497 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-catalog-content\") pod \"b28e03d1-af1f-4f04-ac10-91fce1fde925\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.003560 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qz6j\" (UniqueName: \"kubernetes.io/projected/b28e03d1-af1f-4f04-ac10-91fce1fde925-kube-api-access-6qz6j\") pod \"b28e03d1-af1f-4f04-ac10-91fce1fde925\" (UID: \"b28e03d1-af1f-4f04-ac10-91fce1fde925\") " Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.003579 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-utilities" (OuterVolumeSpecName: "utilities") pod "b28e03d1-af1f-4f04-ac10-91fce1fde925" (UID: "b28e03d1-af1f-4f04-ac10-91fce1fde925"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.004318 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.033250 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28e03d1-af1f-4f04-ac10-91fce1fde925-kube-api-access-6qz6j" (OuterVolumeSpecName: "kube-api-access-6qz6j") pod "b28e03d1-af1f-4f04-ac10-91fce1fde925" (UID: "b28e03d1-af1f-4f04-ac10-91fce1fde925"). InnerVolumeSpecName "kube-api-access-6qz6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.107130 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qz6j\" (UniqueName: \"kubernetes.io/projected/b28e03d1-af1f-4f04-ac10-91fce1fde925-kube-api-access-6qz6j\") on node \"crc\" DevicePath \"\"" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.109355 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b28e03d1-af1f-4f04-ac10-91fce1fde925" (UID: "b28e03d1-af1f-4f04-ac10-91fce1fde925"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.210465 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b28e03d1-af1f-4f04-ac10-91fce1fde925-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.480688 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgcsn" event={"ID":"b28e03d1-af1f-4f04-ac10-91fce1fde925","Type":"ContainerDied","Data":"5616bc0c6a831d71f62de264dee036ae64c9a2a681529f2bb08ff79989847518"} Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.480745 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgcsn" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.480985 4713 scope.go:117] "RemoveContainer" containerID="a20dd678b38e40c192cb2b14c7558caf95cd04333fa8aa371946fc600ec1f42e" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.524162 4713 scope.go:117] "RemoveContainer" containerID="94461d3537515d3237d886b0f3c41389a1169c6966cc506f56798a98cde352de" Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.524793 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mgcsn"] Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.535574 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mgcsn"] Mar 14 06:28:56 crc kubenswrapper[4713]: I0314 06:28:56.556874 4713 scope.go:117] "RemoveContainer" containerID="47edbe4e560c90a5e421fc848956b7dbad838bfffeaa11754ad0e257af9a6aa6" Mar 14 06:28:57 crc kubenswrapper[4713]: I0314 06:28:57.579909 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28e03d1-af1f-4f04-ac10-91fce1fde925" path="/var/lib/kubelet/pods/b28e03d1-af1f-4f04-ac10-91fce1fde925/volumes" Mar 14 06:28:59 crc 
kubenswrapper[4713]: E0314 06:28:59.115971 4713 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.106:54468->38.102.83.106:35023: write tcp 38.102.83.106:54468->38.102.83.106:35023: write: broken pipe Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.154653 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt"] Mar 14 06:30:00 crc kubenswrapper[4713]: E0314 06:30:00.158253 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerName="extract-content" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.158370 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerName="extract-content" Mar 14 06:30:00 crc kubenswrapper[4713]: E0314 06:30:00.158463 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerName="extract-utilities" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.158540 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerName="extract-utilities" Mar 14 06:30:00 crc kubenswrapper[4713]: E0314 06:30:00.158618 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerName="registry-server" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.158695 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerName="registry-server" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.159287 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28e03d1-af1f-4f04-ac10-91fce1fde925" containerName="registry-server" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.160901 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.168341 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.168895 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.170010 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557830-kfkcp"] Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.172676 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557830-kfkcp" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.175868 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.176308 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.178678 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557830-kfkcp"] Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.179021 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.189422 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt"] Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.306576 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bb531f8c-c460-4477-a1be-26640a8cea40-config-volume\") pod \"collect-profiles-29557830-8flzt\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.306755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwbqk\" (UniqueName: \"kubernetes.io/projected/0cc3d796-1079-44bb-9c1b-47c7aab13fe4-kube-api-access-xwbqk\") pod \"auto-csr-approver-29557830-kfkcp\" (UID: \"0cc3d796-1079-44bb-9c1b-47c7aab13fe4\") " pod="openshift-infra/auto-csr-approver-29557830-kfkcp" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.307526 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb531f8c-c460-4477-a1be-26640a8cea40-secret-volume\") pod \"collect-profiles-29557830-8flzt\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.307615 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vkmw\" (UniqueName: \"kubernetes.io/projected/bb531f8c-c460-4477-a1be-26640a8cea40-kube-api-access-7vkmw\") pod \"collect-profiles-29557830-8flzt\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.411216 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbqk\" (UniqueName: \"kubernetes.io/projected/0cc3d796-1079-44bb-9c1b-47c7aab13fe4-kube-api-access-xwbqk\") pod \"auto-csr-approver-29557830-kfkcp\" (UID: \"0cc3d796-1079-44bb-9c1b-47c7aab13fe4\") " pod="openshift-infra/auto-csr-approver-29557830-kfkcp" Mar 14 
06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.411906 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb531f8c-c460-4477-a1be-26640a8cea40-secret-volume\") pod \"collect-profiles-29557830-8flzt\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.412038 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vkmw\" (UniqueName: \"kubernetes.io/projected/bb531f8c-c460-4477-a1be-26640a8cea40-kube-api-access-7vkmw\") pod \"collect-profiles-29557830-8flzt\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.412302 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb531f8c-c460-4477-a1be-26640a8cea40-config-volume\") pod \"collect-profiles-29557830-8flzt\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.413102 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb531f8c-c460-4477-a1be-26640a8cea40-config-volume\") pod \"collect-profiles-29557830-8flzt\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.431416 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb531f8c-c460-4477-a1be-26640a8cea40-secret-volume\") pod \"collect-profiles-29557830-8flzt\" (UID: 
\"bb531f8c-c460-4477-a1be-26640a8cea40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.436254 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vkmw\" (UniqueName: \"kubernetes.io/projected/bb531f8c-c460-4477-a1be-26640a8cea40-kube-api-access-7vkmw\") pod \"collect-profiles-29557830-8flzt\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.436902 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwbqk\" (UniqueName: \"kubernetes.io/projected/0cc3d796-1079-44bb-9c1b-47c7aab13fe4-kube-api-access-xwbqk\") pod \"auto-csr-approver-29557830-kfkcp\" (UID: \"0cc3d796-1079-44bb-9c1b-47c7aab13fe4\") " pod="openshift-infra/auto-csr-approver-29557830-kfkcp" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.548849 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:00 crc kubenswrapper[4713]: I0314 06:30:00.561422 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557830-kfkcp" Mar 14 06:30:01 crc kubenswrapper[4713]: I0314 06:30:01.137819 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt"] Mar 14 06:30:01 crc kubenswrapper[4713]: I0314 06:30:01.197833 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" event={"ID":"bb531f8c-c460-4477-a1be-26640a8cea40","Type":"ContainerStarted","Data":"2d9b88e402793101069663199f15d2839ac5456a43c3a94e2540f286d361f957"} Mar 14 06:30:01 crc kubenswrapper[4713]: I0314 06:30:01.235899 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557830-kfkcp"] Mar 14 06:30:01 crc kubenswrapper[4713]: W0314 06:30:01.243903 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cc3d796_1079_44bb_9c1b_47c7aab13fe4.slice/crio-217cf96a76447485d2a2cb3fde3673c36f4fc035487cc1629d6cb0d0dbe74d48 WatchSource:0}: Error finding container 217cf96a76447485d2a2cb3fde3673c36f4fc035487cc1629d6cb0d0dbe74d48: Status 404 returned error can't find the container with id 217cf96a76447485d2a2cb3fde3673c36f4fc035487cc1629d6cb0d0dbe74d48 Mar 14 06:30:02 crc kubenswrapper[4713]: I0314 06:30:02.211027 4713 generic.go:334] "Generic (PLEG): container finished" podID="bb531f8c-c460-4477-a1be-26640a8cea40" containerID="a7f79758e81a6492f66974d8884a1505d3baf6d94f72f7e20fbe74e0a1395a15" exitCode=0 Mar 14 06:30:02 crc kubenswrapper[4713]: I0314 06:30:02.211445 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" event={"ID":"bb531f8c-c460-4477-a1be-26640a8cea40","Type":"ContainerDied","Data":"a7f79758e81a6492f66974d8884a1505d3baf6d94f72f7e20fbe74e0a1395a15"} Mar 14 06:30:02 crc kubenswrapper[4713]: I0314 06:30:02.212476 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557830-kfkcp" event={"ID":"0cc3d796-1079-44bb-9c1b-47c7aab13fe4","Type":"ContainerStarted","Data":"217cf96a76447485d2a2cb3fde3673c36f4fc035487cc1629d6cb0d0dbe74d48"} Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.722113 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.879344 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vkmw\" (UniqueName: \"kubernetes.io/projected/bb531f8c-c460-4477-a1be-26640a8cea40-kube-api-access-7vkmw\") pod \"bb531f8c-c460-4477-a1be-26640a8cea40\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.879474 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb531f8c-c460-4477-a1be-26640a8cea40-secret-volume\") pod \"bb531f8c-c460-4477-a1be-26640a8cea40\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.879660 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb531f8c-c460-4477-a1be-26640a8cea40-config-volume\") pod \"bb531f8c-c460-4477-a1be-26640a8cea40\" (UID: \"bb531f8c-c460-4477-a1be-26640a8cea40\") " Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.880935 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb531f8c-c460-4477-a1be-26640a8cea40-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb531f8c-c460-4477-a1be-26640a8cea40" (UID: "bb531f8c-c460-4477-a1be-26640a8cea40"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.887378 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb531f8c-c460-4477-a1be-26640a8cea40-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bb531f8c-c460-4477-a1be-26640a8cea40" (UID: "bb531f8c-c460-4477-a1be-26640a8cea40"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.887557 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb531f8c-c460-4477-a1be-26640a8cea40-kube-api-access-7vkmw" (OuterVolumeSpecName: "kube-api-access-7vkmw") pod "bb531f8c-c460-4477-a1be-26640a8cea40" (UID: "bb531f8c-c460-4477-a1be-26640a8cea40"). InnerVolumeSpecName "kube-api-access-7vkmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.983068 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb531f8c-c460-4477-a1be-26640a8cea40-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.983285 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vkmw\" (UniqueName: \"kubernetes.io/projected/bb531f8c-c460-4477-a1be-26640a8cea40-kube-api-access-7vkmw\") on node \"crc\" DevicePath \"\"" Mar 14 06:30:03 crc kubenswrapper[4713]: I0314 06:30:03.983371 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb531f8c-c460-4477-a1be-26640a8cea40-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 06:30:04 crc kubenswrapper[4713]: I0314 06:30:04.251113 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" 
event={"ID":"bb531f8c-c460-4477-a1be-26640a8cea40","Type":"ContainerDied","Data":"2d9b88e402793101069663199f15d2839ac5456a43c3a94e2540f286d361f957"} Mar 14 06:30:04 crc kubenswrapper[4713]: I0314 06:30:04.251465 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d9b88e402793101069663199f15d2839ac5456a43c3a94e2540f286d361f957" Mar 14 06:30:04 crc kubenswrapper[4713]: I0314 06:30:04.251196 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt" Mar 14 06:30:04 crc kubenswrapper[4713]: I0314 06:30:04.819253 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h"] Mar 14 06:30:04 crc kubenswrapper[4713]: I0314 06:30:04.839869 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557785-pc69h"] Mar 14 06:30:05 crc kubenswrapper[4713]: I0314 06:30:05.619834 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13aef23-d004-42c0-9e55-1e350f6cd1b0" path="/var/lib/kubelet/pods/a13aef23-d004-42c0-9e55-1e350f6cd1b0/volumes" Mar 14 06:30:06 crc kubenswrapper[4713]: I0314 06:30:06.274085 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557830-kfkcp" event={"ID":"0cc3d796-1079-44bb-9c1b-47c7aab13fe4","Type":"ContainerStarted","Data":"dfc245dd3c9e38bb7e9018e01254c04cff4a8a7f66accef2e4622b275f1334cf"} Mar 14 06:30:06 crc kubenswrapper[4713]: I0314 06:30:06.293003 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557830-kfkcp" podStartSLOduration=2.034298585 podStartE2EDuration="6.292981032s" podCreationTimestamp="2026-03-14 06:30:00 +0000 UTC" firstStartedPulling="2026-03-14 06:30:01.24876003 +0000 UTC m=+3784.336669330" lastFinishedPulling="2026-03-14 06:30:05.507442477 +0000 
UTC m=+3788.595351777" observedRunningTime="2026-03-14 06:30:06.285501668 +0000 UTC m=+3789.373410978" watchObservedRunningTime="2026-03-14 06:30:06.292981032 +0000 UTC m=+3789.380890332" Mar 14 06:30:07 crc kubenswrapper[4713]: I0314 06:30:07.287688 4713 generic.go:334] "Generic (PLEG): container finished" podID="0cc3d796-1079-44bb-9c1b-47c7aab13fe4" containerID="dfc245dd3c9e38bb7e9018e01254c04cff4a8a7f66accef2e4622b275f1334cf" exitCode=0 Mar 14 06:30:07 crc kubenswrapper[4713]: I0314 06:30:07.287731 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557830-kfkcp" event={"ID":"0cc3d796-1079-44bb-9c1b-47c7aab13fe4","Type":"ContainerDied","Data":"dfc245dd3c9e38bb7e9018e01254c04cff4a8a7f66accef2e4622b275f1334cf"} Mar 14 06:30:09 crc kubenswrapper[4713]: I0314 06:30:09.318969 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557830-kfkcp" event={"ID":"0cc3d796-1079-44bb-9c1b-47c7aab13fe4","Type":"ContainerDied","Data":"217cf96a76447485d2a2cb3fde3673c36f4fc035487cc1629d6cb0d0dbe74d48"} Mar 14 06:30:09 crc kubenswrapper[4713]: I0314 06:30:09.320693 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="217cf96a76447485d2a2cb3fde3673c36f4fc035487cc1629d6cb0d0dbe74d48" Mar 14 06:30:09 crc kubenswrapper[4713]: I0314 06:30:09.358353 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557830-kfkcp" Mar 14 06:30:09 crc kubenswrapper[4713]: I0314 06:30:09.373599 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwbqk\" (UniqueName: \"kubernetes.io/projected/0cc3d796-1079-44bb-9c1b-47c7aab13fe4-kube-api-access-xwbqk\") pod \"0cc3d796-1079-44bb-9c1b-47c7aab13fe4\" (UID: \"0cc3d796-1079-44bb-9c1b-47c7aab13fe4\") " Mar 14 06:30:09 crc kubenswrapper[4713]: I0314 06:30:09.392089 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc3d796-1079-44bb-9c1b-47c7aab13fe4-kube-api-access-xwbqk" (OuterVolumeSpecName: "kube-api-access-xwbqk") pod "0cc3d796-1079-44bb-9c1b-47c7aab13fe4" (UID: "0cc3d796-1079-44bb-9c1b-47c7aab13fe4"). InnerVolumeSpecName "kube-api-access-xwbqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:30:09 crc kubenswrapper[4713]: I0314 06:30:09.478928 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwbqk\" (UniqueName: \"kubernetes.io/projected/0cc3d796-1079-44bb-9c1b-47c7aab13fe4-kube-api-access-xwbqk\") on node \"crc\" DevicePath \"\"" Mar 14 06:30:10 crc kubenswrapper[4713]: I0314 06:30:10.329280 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557830-kfkcp" Mar 14 06:30:10 crc kubenswrapper[4713]: I0314 06:30:10.483431 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557824-m429t"] Mar 14 06:30:10 crc kubenswrapper[4713]: I0314 06:30:10.501705 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557824-m429t"] Mar 14 06:30:11 crc kubenswrapper[4713]: I0314 06:30:11.590497 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e5b8637-afeb-47e9-95c4-5d69b6833b45" path="/var/lib/kubelet/pods/2e5b8637-afeb-47e9-95c4-5d69b6833b45/volumes" Mar 14 06:30:40 crc kubenswrapper[4713]: I0314 06:30:40.732191 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:30:40 crc kubenswrapper[4713]: I0314 06:30:40.732823 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:30:53 crc kubenswrapper[4713]: I0314 06:30:53.287656 4713 scope.go:117] "RemoveContainer" containerID="29316001ef44b54dbf842e94e245ce492c9ba896bc4f52b7d0f37c23004ecdec" Mar 14 06:30:53 crc kubenswrapper[4713]: I0314 06:30:53.336147 4713 scope.go:117] "RemoveContainer" containerID="5281632e9ca61337a6079843fc3681d7575b5026e9ae4878809b4e03fb81a9a0" Mar 14 06:31:10 crc kubenswrapper[4713]: I0314 06:31:10.731592 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:31:10 crc kubenswrapper[4713]: I0314 06:31:10.732236 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:31:40 crc kubenswrapper[4713]: I0314 06:31:40.732281 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:31:40 crc kubenswrapper[4713]: I0314 06:31:40.732906 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:31:40 crc kubenswrapper[4713]: I0314 06:31:40.732975 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 06:31:40 crc kubenswrapper[4713]: I0314 06:31:40.734033 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 06:31:40 crc kubenswrapper[4713]: I0314 06:31:40.734092 4713 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" gracePeriod=600 Mar 14 06:31:40 crc kubenswrapper[4713]: E0314 06:31:40.855071 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:31:41 crc kubenswrapper[4713]: I0314 06:31:41.830952 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" exitCode=0 Mar 14 06:31:41 crc kubenswrapper[4713]: I0314 06:31:41.831028 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"} Mar 14 06:31:41 crc kubenswrapper[4713]: I0314 06:31:41.831300 4713 scope.go:117] "RemoveContainer" containerID="67da5416042dba6c94a45a0cee4cf85ae09af5e4e6ed64834715198736da7a6c" Mar 14 06:31:41 crc kubenswrapper[4713]: I0314 06:31:41.831783 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:31:41 crc kubenswrapper[4713]: E0314 06:31:41.832286 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:31:55 crc kubenswrapper[4713]: I0314 06:31:55.563806 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:31:55 crc kubenswrapper[4713]: E0314 06:31:55.564549 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.144465 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557832-mszng"] Mar 14 06:32:00 crc kubenswrapper[4713]: E0314 06:32:00.145478 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc3d796-1079-44bb-9c1b-47c7aab13fe4" containerName="oc" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.145494 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc3d796-1079-44bb-9c1b-47c7aab13fe4" containerName="oc" Mar 14 06:32:00 crc kubenswrapper[4713]: E0314 06:32:00.145528 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb531f8c-c460-4477-a1be-26640a8cea40" containerName="collect-profiles" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.145538 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb531f8c-c460-4477-a1be-26640a8cea40" containerName="collect-profiles" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.145803 4713 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bb531f8c-c460-4477-a1be-26640a8cea40" containerName="collect-profiles" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.145817 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc3d796-1079-44bb-9c1b-47c7aab13fe4" containerName="oc" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.146731 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557832-mszng" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.150842 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.151176 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.151480 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.167871 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557832-mszng"] Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.265111 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsxs\" (UniqueName: \"kubernetes.io/projected/59751ee7-09b0-469a-8f13-4070b096c60e-kube-api-access-6xsxs\") pod \"auto-csr-approver-29557832-mszng\" (UID: \"59751ee7-09b0-469a-8f13-4070b096c60e\") " pod="openshift-infra/auto-csr-approver-29557832-mszng" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.367615 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsxs\" (UniqueName: \"kubernetes.io/projected/59751ee7-09b0-469a-8f13-4070b096c60e-kube-api-access-6xsxs\") pod \"auto-csr-approver-29557832-mszng\" (UID: \"59751ee7-09b0-469a-8f13-4070b096c60e\") " 
pod="openshift-infra/auto-csr-approver-29557832-mszng" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.393294 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsxs\" (UniqueName: \"kubernetes.io/projected/59751ee7-09b0-469a-8f13-4070b096c60e-kube-api-access-6xsxs\") pod \"auto-csr-approver-29557832-mszng\" (UID: \"59751ee7-09b0-469a-8f13-4070b096c60e\") " pod="openshift-infra/auto-csr-approver-29557832-mszng" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.470176 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557832-mszng" Mar 14 06:32:00 crc kubenswrapper[4713]: I0314 06:32:00.954357 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557832-mszng"] Mar 14 06:32:01 crc kubenswrapper[4713]: I0314 06:32:01.276876 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557832-mszng" event={"ID":"59751ee7-09b0-469a-8f13-4070b096c60e","Type":"ContainerStarted","Data":"dd0f3aefcfe583588deebe5b207b2527dfe115900074f36cd494cc770e11bc44"} Mar 14 06:32:03 crc kubenswrapper[4713]: I0314 06:32:03.299255 4713 generic.go:334] "Generic (PLEG): container finished" podID="59751ee7-09b0-469a-8f13-4070b096c60e" containerID="938634140a338183571fe3eb83b2773480cc9e51964be9c119ab5abfaac1b413" exitCode=0 Mar 14 06:32:03 crc kubenswrapper[4713]: I0314 06:32:03.299552 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557832-mszng" event={"ID":"59751ee7-09b0-469a-8f13-4070b096c60e","Type":"ContainerDied","Data":"938634140a338183571fe3eb83b2773480cc9e51964be9c119ab5abfaac1b413"} Mar 14 06:32:04 crc kubenswrapper[4713]: I0314 06:32:04.728136 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557832-mszng" Mar 14 06:32:04 crc kubenswrapper[4713]: I0314 06:32:04.859517 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xsxs\" (UniqueName: \"kubernetes.io/projected/59751ee7-09b0-469a-8f13-4070b096c60e-kube-api-access-6xsxs\") pod \"59751ee7-09b0-469a-8f13-4070b096c60e\" (UID: \"59751ee7-09b0-469a-8f13-4070b096c60e\") " Mar 14 06:32:04 crc kubenswrapper[4713]: I0314 06:32:04.868411 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59751ee7-09b0-469a-8f13-4070b096c60e-kube-api-access-6xsxs" (OuterVolumeSpecName: "kube-api-access-6xsxs") pod "59751ee7-09b0-469a-8f13-4070b096c60e" (UID: "59751ee7-09b0-469a-8f13-4070b096c60e"). InnerVolumeSpecName "kube-api-access-6xsxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:32:04 crc kubenswrapper[4713]: I0314 06:32:04.963230 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsxs\" (UniqueName: \"kubernetes.io/projected/59751ee7-09b0-469a-8f13-4070b096c60e-kube-api-access-6xsxs\") on node \"crc\" DevicePath \"\"" Mar 14 06:32:05 crc kubenswrapper[4713]: I0314 06:32:05.323755 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557832-mszng" event={"ID":"59751ee7-09b0-469a-8f13-4070b096c60e","Type":"ContainerDied","Data":"dd0f3aefcfe583588deebe5b207b2527dfe115900074f36cd494cc770e11bc44"} Mar 14 06:32:05 crc kubenswrapper[4713]: I0314 06:32:05.323794 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd0f3aefcfe583588deebe5b207b2527dfe115900074f36cd494cc770e11bc44" Mar 14 06:32:05 crc kubenswrapper[4713]: I0314 06:32:05.323824 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557832-mszng" Mar 14 06:32:05 crc kubenswrapper[4713]: I0314 06:32:05.810696 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557826-mbqgx"] Mar 14 06:32:05 crc kubenswrapper[4713]: I0314 06:32:05.822598 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557826-mbqgx"] Mar 14 06:32:07 crc kubenswrapper[4713]: I0314 06:32:07.575923 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:32:07 crc kubenswrapper[4713]: E0314 06:32:07.576418 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:32:07 crc kubenswrapper[4713]: I0314 06:32:07.582726 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c88a3916-44fd-4e2d-a480-daa6e3c65230" path="/var/lib/kubelet/pods/c88a3916-44fd-4e2d-a480-daa6e3c65230/volumes" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.787083 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vp42r"] Mar 14 06:32:13 crc kubenswrapper[4713]: E0314 06:32:13.788751 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59751ee7-09b0-469a-8f13-4070b096c60e" containerName="oc" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.788769 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="59751ee7-09b0-469a-8f13-4070b096c60e" containerName="oc" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.789050 4713 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="59751ee7-09b0-469a-8f13-4070b096c60e" containerName="oc" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.792730 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.800654 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp42r"] Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.862346 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkgv9\" (UniqueName: \"kubernetes.io/projected/d207a86a-b22e-482c-b823-febd5af64f3a-kube-api-access-tkgv9\") pod \"redhat-marketplace-vp42r\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.862471 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-utilities\") pod \"redhat-marketplace-vp42r\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.862560 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-catalog-content\") pod \"redhat-marketplace-vp42r\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.964332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-utilities\") pod \"redhat-marketplace-vp42r\" (UID: 
\"d207a86a-b22e-482c-b823-febd5af64f3a\") " pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.964446 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-catalog-content\") pod \"redhat-marketplace-vp42r\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.964564 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkgv9\" (UniqueName: \"kubernetes.io/projected/d207a86a-b22e-482c-b823-febd5af64f3a-kube-api-access-tkgv9\") pod \"redhat-marketplace-vp42r\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.964793 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-utilities\") pod \"redhat-marketplace-vp42r\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.964874 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-catalog-content\") pod \"redhat-marketplace-vp42r\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:13 crc kubenswrapper[4713]: I0314 06:32:13.988584 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkgv9\" (UniqueName: \"kubernetes.io/projected/d207a86a-b22e-482c-b823-febd5af64f3a-kube-api-access-tkgv9\") pod \"redhat-marketplace-vp42r\" (UID: 
\"d207a86a-b22e-482c-b823-febd5af64f3a\") " pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:14 crc kubenswrapper[4713]: I0314 06:32:14.128935 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:14 crc kubenswrapper[4713]: I0314 06:32:14.663853 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp42r"] Mar 14 06:32:15 crc kubenswrapper[4713]: I0314 06:32:15.128877 4713 generic.go:334] "Generic (PLEG): container finished" podID="d207a86a-b22e-482c-b823-febd5af64f3a" containerID="78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c" exitCode=0 Mar 14 06:32:15 crc kubenswrapper[4713]: I0314 06:32:15.129037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp42r" event={"ID":"d207a86a-b22e-482c-b823-febd5af64f3a","Type":"ContainerDied","Data":"78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c"} Mar 14 06:32:15 crc kubenswrapper[4713]: I0314 06:32:15.129167 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp42r" event={"ID":"d207a86a-b22e-482c-b823-febd5af64f3a","Type":"ContainerStarted","Data":"8df2eb838e44f85b400006b026b2076b305e39eae383831de80809e127f82e21"} Mar 14 06:32:17 crc kubenswrapper[4713]: I0314 06:32:17.151283 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp42r" event={"ID":"d207a86a-b22e-482c-b823-febd5af64f3a","Type":"ContainerStarted","Data":"f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9"} Mar 14 06:32:20 crc kubenswrapper[4713]: I0314 06:32:20.184823 4713 generic.go:334] "Generic (PLEG): container finished" podID="d207a86a-b22e-482c-b823-febd5af64f3a" containerID="f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9" exitCode=0 Mar 14 06:32:20 crc kubenswrapper[4713]: I0314 
06:32:20.185610 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp42r" event={"ID":"d207a86a-b22e-482c-b823-febd5af64f3a","Type":"ContainerDied","Data":"f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9"} Mar 14 06:32:22 crc kubenswrapper[4713]: I0314 06:32:22.563832 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:32:22 crc kubenswrapper[4713]: E0314 06:32:22.564594 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:32:23 crc kubenswrapper[4713]: I0314 06:32:23.228938 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp42r" event={"ID":"d207a86a-b22e-482c-b823-febd5af64f3a","Type":"ContainerStarted","Data":"bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747"} Mar 14 06:32:23 crc kubenswrapper[4713]: I0314 06:32:23.258607 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vp42r" podStartSLOduration=2.468708129 podStartE2EDuration="10.258582617s" podCreationTimestamp="2026-03-14 06:32:13 +0000 UTC" firstStartedPulling="2026-03-14 06:32:15.131191209 +0000 UTC m=+3918.219100499" lastFinishedPulling="2026-03-14 06:32:22.921065687 +0000 UTC m=+3926.008974987" observedRunningTime="2026-03-14 06:32:23.247800559 +0000 UTC m=+3926.335709869" watchObservedRunningTime="2026-03-14 06:32:23.258582617 +0000 UTC m=+3926.346491917" Mar 14 06:32:24 crc kubenswrapper[4713]: I0314 06:32:24.129485 4713 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:24 crc kubenswrapper[4713]: I0314 06:32:24.129745 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:25 crc kubenswrapper[4713]: I0314 06:32:25.179502 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-vp42r" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" containerName="registry-server" probeResult="failure" output=< Mar 14 06:32:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:32:25 crc kubenswrapper[4713]: > Mar 14 06:32:34 crc kubenswrapper[4713]: I0314 06:32:34.181481 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:34 crc kubenswrapper[4713]: I0314 06:32:34.235446 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:34 crc kubenswrapper[4713]: I0314 06:32:34.422394 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp42r"] Mar 14 06:32:35 crc kubenswrapper[4713]: I0314 06:32:35.359354 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vp42r" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" containerName="registry-server" containerID="cri-o://bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747" gracePeriod=2 Mar 14 06:32:35 crc kubenswrapper[4713]: I0314 06:32:35.563926 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:32:35 crc kubenswrapper[4713]: E0314 06:32:35.564621 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.115312 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.311470 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkgv9\" (UniqueName: \"kubernetes.io/projected/d207a86a-b22e-482c-b823-febd5af64f3a-kube-api-access-tkgv9\") pod \"d207a86a-b22e-482c-b823-febd5af64f3a\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.311519 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-catalog-content\") pod \"d207a86a-b22e-482c-b823-febd5af64f3a\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.311808 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-utilities\") pod \"d207a86a-b22e-482c-b823-febd5af64f3a\" (UID: \"d207a86a-b22e-482c-b823-febd5af64f3a\") " Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.312703 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-utilities" (OuterVolumeSpecName: "utilities") pod "d207a86a-b22e-482c-b823-febd5af64f3a" (UID: "d207a86a-b22e-482c-b823-febd5af64f3a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.326999 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d207a86a-b22e-482c-b823-febd5af64f3a-kube-api-access-tkgv9" (OuterVolumeSpecName: "kube-api-access-tkgv9") pod "d207a86a-b22e-482c-b823-febd5af64f3a" (UID: "d207a86a-b22e-482c-b823-febd5af64f3a"). InnerVolumeSpecName "kube-api-access-tkgv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.343859 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d207a86a-b22e-482c-b823-febd5af64f3a" (UID: "d207a86a-b22e-482c-b823-febd5af64f3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.388860 4713 generic.go:334] "Generic (PLEG): container finished" podID="d207a86a-b22e-482c-b823-febd5af64f3a" containerID="bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747" exitCode=0 Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.388914 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp42r" event={"ID":"d207a86a-b22e-482c-b823-febd5af64f3a","Type":"ContainerDied","Data":"bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747"} Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.388947 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp42r" event={"ID":"d207a86a-b22e-482c-b823-febd5af64f3a","Type":"ContainerDied","Data":"8df2eb838e44f85b400006b026b2076b305e39eae383831de80809e127f82e21"} Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.388964 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vp42r" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.388975 4713 scope.go:117] "RemoveContainer" containerID="bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.425305 4713 scope.go:117] "RemoveContainer" containerID="f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.447028 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.447073 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkgv9\" (UniqueName: \"kubernetes.io/projected/d207a86a-b22e-482c-b823-febd5af64f3a-kube-api-access-tkgv9\") on node \"crc\" DevicePath \"\"" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.447091 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d207a86a-b22e-482c-b823-febd5af64f3a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.471662 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp42r"] Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.478361 4713 scope.go:117] "RemoveContainer" containerID="78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.484044 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp42r"] Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.521878 4713 scope.go:117] "RemoveContainer" containerID="bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747" Mar 14 06:32:36 crc kubenswrapper[4713]: E0314 
06:32:36.522441 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747\": container with ID starting with bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747 not found: ID does not exist" containerID="bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.522479 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747"} err="failed to get container status \"bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747\": rpc error: code = NotFound desc = could not find container \"bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747\": container with ID starting with bd5355825429b1aeb85631a811a5cd10c2b9091b65c8b8a25db50b23cabb2747 not found: ID does not exist" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.522503 4713 scope.go:117] "RemoveContainer" containerID="f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9" Mar 14 06:32:36 crc kubenswrapper[4713]: E0314 06:32:36.522958 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9\": container with ID starting with f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9 not found: ID does not exist" containerID="f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.522987 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9"} err="failed to get container status \"f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9\": rpc 
error: code = NotFound desc = could not find container \"f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9\": container with ID starting with f86ec5a7e8ba5c3bfdc3aeadfa55a52b608a863b319723ce72313353303767e9 not found: ID does not exist" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.523004 4713 scope.go:117] "RemoveContainer" containerID="78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c" Mar 14 06:32:36 crc kubenswrapper[4713]: E0314 06:32:36.523289 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c\": container with ID starting with 78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c not found: ID does not exist" containerID="78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c" Mar 14 06:32:36 crc kubenswrapper[4713]: I0314 06:32:36.523342 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c"} err="failed to get container status \"78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c\": rpc error: code = NotFound desc = could not find container \"78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c\": container with ID starting with 78d479a10f32c9e99eb399cc0eb29b164a6445b4bddc4495ab06e43e91478f4c not found: ID does not exist" Mar 14 06:32:37 crc kubenswrapper[4713]: I0314 06:32:37.580496 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" path="/var/lib/kubelet/pods/d207a86a-b22e-482c-b823-febd5af64f3a/volumes" Mar 14 06:32:50 crc kubenswrapper[4713]: I0314 06:32:50.564100 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:32:50 crc kubenswrapper[4713]: E0314 06:32:50.564994 4713 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:32:53 crc kubenswrapper[4713]: I0314 06:32:53.510348 4713 scope.go:117] "RemoveContainer" containerID="204c9c5165fb3b19b239e26b0df833e3ed2f773f1bf30fa69f1131df1bb19193" Mar 14 06:33:03 crc kubenswrapper[4713]: I0314 06:33:03.895697 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": context deadline exceeded" start-of-body= Mar 14 06:33:03 crc kubenswrapper[4713]: I0314 06:33:03.896532 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": context deadline exceeded" Mar 14 06:33:03 crc kubenswrapper[4713]: I0314 06:33:03.899659 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output=< Mar 14 06:33:03 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:33:03 crc kubenswrapper[4713]: > Mar 14 06:33:03 crc kubenswrapper[4713]: I0314 06:33:03.900917 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Mar 14 06:33:03 crc kubenswrapper[4713]: I0314 06:33:03.900956 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:33:03 crc kubenswrapper[4713]: I0314 06:33:03.966128 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:33:03 crc kubenswrapper[4713]: E0314 06:33:03.967096 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:33:17 crc kubenswrapper[4713]: I0314 06:33:17.572051 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:33:17 crc kubenswrapper[4713]: E0314 06:33:17.572895 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:33:30 crc kubenswrapper[4713]: I0314 06:33:30.564232 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:33:30 crc kubenswrapper[4713]: E0314 
06:33:30.565115 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:33:41 crc kubenswrapper[4713]: I0314 06:33:41.563424 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:33:41 crc kubenswrapper[4713]: E0314 06:33:41.565340 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:33:54 crc kubenswrapper[4713]: I0314 06:33:54.564105 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:33:54 crc kubenswrapper[4713]: E0314 06:33:54.565161 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.147407 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557834-nrlml"] Mar 14 06:34:00 crc 
kubenswrapper[4713]: E0314 06:34:00.148409 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" containerName="extract-content" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.148427 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" containerName="extract-content" Mar 14 06:34:00 crc kubenswrapper[4713]: E0314 06:34:00.148439 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" containerName="registry-server" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.148447 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" containerName="registry-server" Mar 14 06:34:00 crc kubenswrapper[4713]: E0314 06:34:00.148506 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" containerName="extract-utilities" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.148517 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" containerName="extract-utilities" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.148786 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d207a86a-b22e-482c-b823-febd5af64f3a" containerName="registry-server" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.149656 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557834-nrlml" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.152955 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.153159 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.153647 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.159335 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557834-nrlml"] Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.206101 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4hbp\" (UniqueName: \"kubernetes.io/projected/c047064b-149c-47fe-8c68-d6de8f0c9bd6-kube-api-access-s4hbp\") pod \"auto-csr-approver-29557834-nrlml\" (UID: \"c047064b-149c-47fe-8c68-d6de8f0c9bd6\") " pod="openshift-infra/auto-csr-approver-29557834-nrlml" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.309412 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4hbp\" (UniqueName: \"kubernetes.io/projected/c047064b-149c-47fe-8c68-d6de8f0c9bd6-kube-api-access-s4hbp\") pod \"auto-csr-approver-29557834-nrlml\" (UID: \"c047064b-149c-47fe-8c68-d6de8f0c9bd6\") " pod="openshift-infra/auto-csr-approver-29557834-nrlml" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.332543 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4hbp\" (UniqueName: \"kubernetes.io/projected/c047064b-149c-47fe-8c68-d6de8f0c9bd6-kube-api-access-s4hbp\") pod \"auto-csr-approver-29557834-nrlml\" (UID: \"c047064b-149c-47fe-8c68-d6de8f0c9bd6\") " 
pod="openshift-infra/auto-csr-approver-29557834-nrlml" Mar 14 06:34:00 crc kubenswrapper[4713]: I0314 06:34:00.467084 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557834-nrlml" Mar 14 06:34:01 crc kubenswrapper[4713]: I0314 06:34:01.034510 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557834-nrlml"] Mar 14 06:34:01 crc kubenswrapper[4713]: I0314 06:34:01.042486 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 06:34:02 crc kubenswrapper[4713]: I0314 06:34:02.011809 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557834-nrlml" event={"ID":"c047064b-149c-47fe-8c68-d6de8f0c9bd6","Type":"ContainerStarted","Data":"49e26572ef3219d95ae5d8ab0f0efa0c202d5a7815b33621186de118c183b64b"} Mar 14 06:34:04 crc kubenswrapper[4713]: I0314 06:34:04.032464 4713 generic.go:334] "Generic (PLEG): container finished" podID="c047064b-149c-47fe-8c68-d6de8f0c9bd6" containerID="f20c94dfab89fffeaab54f26c36eaa3f56908e6b7f1a2e96ef6e832f16011c73" exitCode=0 Mar 14 06:34:04 crc kubenswrapper[4713]: I0314 06:34:04.032566 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557834-nrlml" event={"ID":"c047064b-149c-47fe-8c68-d6de8f0c9bd6","Type":"ContainerDied","Data":"f20c94dfab89fffeaab54f26c36eaa3f56908e6b7f1a2e96ef6e832f16011c73"} Mar 14 06:34:05 crc kubenswrapper[4713]: I0314 06:34:05.519514 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557834-nrlml" Mar 14 06:34:05 crc kubenswrapper[4713]: I0314 06:34:05.639887 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4hbp\" (UniqueName: \"kubernetes.io/projected/c047064b-149c-47fe-8c68-d6de8f0c9bd6-kube-api-access-s4hbp\") pod \"c047064b-149c-47fe-8c68-d6de8f0c9bd6\" (UID: \"c047064b-149c-47fe-8c68-d6de8f0c9bd6\") " Mar 14 06:34:05 crc kubenswrapper[4713]: I0314 06:34:05.647031 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c047064b-149c-47fe-8c68-d6de8f0c9bd6-kube-api-access-s4hbp" (OuterVolumeSpecName: "kube-api-access-s4hbp") pod "c047064b-149c-47fe-8c68-d6de8f0c9bd6" (UID: "c047064b-149c-47fe-8c68-d6de8f0c9bd6"). InnerVolumeSpecName "kube-api-access-s4hbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:34:05 crc kubenswrapper[4713]: I0314 06:34:05.743474 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4hbp\" (UniqueName: \"kubernetes.io/projected/c047064b-149c-47fe-8c68-d6de8f0c9bd6-kube-api-access-s4hbp\") on node \"crc\" DevicePath \"\"" Mar 14 06:34:06 crc kubenswrapper[4713]: I0314 06:34:06.053040 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557834-nrlml" event={"ID":"c047064b-149c-47fe-8c68-d6de8f0c9bd6","Type":"ContainerDied","Data":"49e26572ef3219d95ae5d8ab0f0efa0c202d5a7815b33621186de118c183b64b"} Mar 14 06:34:06 crc kubenswrapper[4713]: I0314 06:34:06.053086 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49e26572ef3219d95ae5d8ab0f0efa0c202d5a7815b33621186de118c183b64b" Mar 14 06:34:06 crc kubenswrapper[4713]: I0314 06:34:06.053105 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557834-nrlml"
Mar 14 06:34:06 crc kubenswrapper[4713]: I0314 06:34:06.569883 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:34:06 crc kubenswrapper[4713]: E0314 06:34:06.570746 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:34:06 crc kubenswrapper[4713]: I0314 06:34:06.618785 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557828-2nlzd"]
Mar 14 06:34:06 crc kubenswrapper[4713]: I0314 06:34:06.630918 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557828-2nlzd"]
Mar 14 06:34:07 crc kubenswrapper[4713]: I0314 06:34:07.579229 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1807a6a6-3794-4fab-bdc7-17c83fa3fdeb" path="/var/lib/kubelet/pods/1807a6a6-3794-4fab-bdc7-17c83fa3fdeb/volumes"
Mar 14 06:34:21 crc kubenswrapper[4713]: I0314 06:34:21.563706 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:34:21 crc kubenswrapper[4713]: E0314 06:34:21.564622 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.160755 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m8dwg"]
Mar 14 06:34:31 crc kubenswrapper[4713]: E0314 06:34:31.164396 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c047064b-149c-47fe-8c68-d6de8f0c9bd6" containerName="oc"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.164606 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c047064b-149c-47fe-8c68-d6de8f0c9bd6" containerName="oc"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.165272 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c047064b-149c-47fe-8c68-d6de8f0c9bd6" containerName="oc"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.170686 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.175670 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m8dwg"]
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.320478 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz7x4\" (UniqueName: \"kubernetes.io/projected/67857300-0f55-4338-99f4-52f33d491a09-kube-api-access-tz7x4\") pod \"redhat-operators-m8dwg\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") " pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.320578 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-catalog-content\") pod \"redhat-operators-m8dwg\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") " pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.320683 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-utilities\") pod \"redhat-operators-m8dwg\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") " pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.423613 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz7x4\" (UniqueName: \"kubernetes.io/projected/67857300-0f55-4338-99f4-52f33d491a09-kube-api-access-tz7x4\") pod \"redhat-operators-m8dwg\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") " pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.424031 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-catalog-content\") pod \"redhat-operators-m8dwg\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") " pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.424175 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-utilities\") pod \"redhat-operators-m8dwg\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") " pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.424638 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-catalog-content\") pod \"redhat-operators-m8dwg\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") " pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.424775 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-utilities\") pod \"redhat-operators-m8dwg\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") " pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.453041 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz7x4\" (UniqueName: \"kubernetes.io/projected/67857300-0f55-4338-99f4-52f33d491a09-kube-api-access-tz7x4\") pod \"redhat-operators-m8dwg\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") " pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.506125 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:31 crc kubenswrapper[4713]: I0314 06:34:31.990146 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m8dwg"]
Mar 14 06:34:32 crc kubenswrapper[4713]: I0314 06:34:32.371470 4713 generic.go:334] "Generic (PLEG): container finished" podID="67857300-0f55-4338-99f4-52f33d491a09" containerID="eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4" exitCode=0
Mar 14 06:34:32 crc kubenswrapper[4713]: I0314 06:34:32.371556 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8dwg" event={"ID":"67857300-0f55-4338-99f4-52f33d491a09","Type":"ContainerDied","Data":"eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4"}
Mar 14 06:34:32 crc kubenswrapper[4713]: I0314 06:34:32.371646 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8dwg" event={"ID":"67857300-0f55-4338-99f4-52f33d491a09","Type":"ContainerStarted","Data":"1d2e8bb63ed9564f8da39e257238dcd200006022a4cceb2759144906de35fdbf"}
Mar 14 06:34:33 crc kubenswrapper[4713]: I0314 06:34:33.384556 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8dwg" event={"ID":"67857300-0f55-4338-99f4-52f33d491a09","Type":"ContainerStarted","Data":"bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b"}
Mar 14 06:34:34 crc kubenswrapper[4713]: I0314 06:34:34.564176 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:34:34 crc kubenswrapper[4713]: E0314 06:34:34.564744 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:34:40 crc kubenswrapper[4713]: I0314 06:34:40.460768 4713 generic.go:334] "Generic (PLEG): container finished" podID="67857300-0f55-4338-99f4-52f33d491a09" containerID="bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b" exitCode=0
Mar 14 06:34:40 crc kubenswrapper[4713]: I0314 06:34:40.460870 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8dwg" event={"ID":"67857300-0f55-4338-99f4-52f33d491a09","Type":"ContainerDied","Data":"bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b"}
Mar 14 06:34:41 crc kubenswrapper[4713]: I0314 06:34:41.472640 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8dwg" event={"ID":"67857300-0f55-4338-99f4-52f33d491a09","Type":"ContainerStarted","Data":"4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8"}
Mar 14 06:34:41 crc kubenswrapper[4713]: I0314 06:34:41.496583 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m8dwg" podStartSLOduration=1.878070803 podStartE2EDuration="10.496566492s" podCreationTimestamp="2026-03-14 06:34:31 +0000 UTC" firstStartedPulling="2026-03-14 06:34:32.374720783 +0000 UTC m=+4055.462630083" lastFinishedPulling="2026-03-14 06:34:40.993216482 +0000 UTC m=+4064.081125772" observedRunningTime="2026-03-14 06:34:41.489729107 +0000 UTC m=+4064.577638407" watchObservedRunningTime="2026-03-14 06:34:41.496566492 +0000 UTC m=+4064.584475792"
Mar 14 06:34:41 crc kubenswrapper[4713]: I0314 06:34:41.507712 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:41 crc kubenswrapper[4713]: I0314 06:34:41.507757 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:34:42 crc kubenswrapper[4713]: I0314 06:34:42.551361 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m8dwg" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:34:42 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:34:42 crc kubenswrapper[4713]: >
Mar 14 06:34:48 crc kubenswrapper[4713]: I0314 06:34:48.565179 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:34:48 crc kubenswrapper[4713]: E0314 06:34:48.566561 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:34:52 crc kubenswrapper[4713]: I0314 06:34:52.575045 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m8dwg" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:34:52 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:34:52 crc kubenswrapper[4713]: >
Mar 14 06:34:53 crc kubenswrapper[4713]: I0314 06:34:53.646939 4713 scope.go:117] "RemoveContainer" containerID="eaa021c6f943b25fcdcdc9073daad184cb58b392754fd96ddfef12c6f0a9b43f"
Mar 14 06:35:00 crc kubenswrapper[4713]: I0314 06:35:00.565447 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:35:00 crc kubenswrapper[4713]: E0314 06:35:00.566533 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:35:02 crc kubenswrapper[4713]: I0314 06:35:02.555939 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m8dwg" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:35:02 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:35:02 crc kubenswrapper[4713]: >
Mar 14 06:35:12 crc kubenswrapper[4713]: I0314 06:35:12.556086 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m8dwg" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:35:12 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:35:12 crc kubenswrapper[4713]: >
Mar 14 06:35:13 crc kubenswrapper[4713]: I0314 06:35:13.563712 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:35:13 crc kubenswrapper[4713]: E0314 06:35:13.564278 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:35:22 crc kubenswrapper[4713]: I0314 06:35:22.553344 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m8dwg" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:35:22 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:35:22 crc kubenswrapper[4713]: >
Mar 14 06:35:28 crc kubenswrapper[4713]: I0314 06:35:28.591852 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:35:28 crc kubenswrapper[4713]: E0314 06:35:28.593536 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:35:32 crc kubenswrapper[4713]: I0314 06:35:32.557413 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m8dwg" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:35:32 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:35:32 crc kubenswrapper[4713]: >
Mar 14 06:35:41 crc kubenswrapper[4713]: I0314 06:35:41.556172 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:35:41 crc kubenswrapper[4713]: I0314 06:35:41.635700 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:35:41 crc kubenswrapper[4713]: I0314 06:35:41.807765 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m8dwg"]
Mar 14 06:35:42 crc kubenswrapper[4713]: I0314 06:35:42.831602 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:35:42 crc kubenswrapper[4713]: E0314 06:35:42.847044 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:35:43 crc kubenswrapper[4713]: I0314 06:35:43.144823 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m8dwg" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server" containerID="cri-o://4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8" gracePeriod=2
Mar 14 06:35:43 crc kubenswrapper[4713]: I0314 06:35:43.777966 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:35:43 crc kubenswrapper[4713]: I0314 06:35:43.978620 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-catalog-content\") pod \"67857300-0f55-4338-99f4-52f33d491a09\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") "
Mar 14 06:35:43 crc kubenswrapper[4713]: I0314 06:35:43.978831 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-utilities\") pod \"67857300-0f55-4338-99f4-52f33d491a09\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") "
Mar 14 06:35:43 crc kubenswrapper[4713]: I0314 06:35:43.978914 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz7x4\" (UniqueName: \"kubernetes.io/projected/67857300-0f55-4338-99f4-52f33d491a09-kube-api-access-tz7x4\") pod \"67857300-0f55-4338-99f4-52f33d491a09\" (UID: \"67857300-0f55-4338-99f4-52f33d491a09\") "
Mar 14 06:35:43 crc kubenswrapper[4713]: I0314 06:35:43.980055 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-utilities" (OuterVolumeSpecName: "utilities") pod "67857300-0f55-4338-99f4-52f33d491a09" (UID: "67857300-0f55-4338-99f4-52f33d491a09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 06:35:43 crc kubenswrapper[4713]: I0314 06:35:43.994595 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67857300-0f55-4338-99f4-52f33d491a09-kube-api-access-tz7x4" (OuterVolumeSpecName: "kube-api-access-tz7x4") pod "67857300-0f55-4338-99f4-52f33d491a09" (UID: "67857300-0f55-4338-99f4-52f33d491a09"). InnerVolumeSpecName "kube-api-access-tz7x4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.080773 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.080806 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz7x4\" (UniqueName: \"kubernetes.io/projected/67857300-0f55-4338-99f4-52f33d491a09-kube-api-access-tz7x4\") on node \"crc\" DevicePath \"\""
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.160119 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67857300-0f55-4338-99f4-52f33d491a09" (UID: "67857300-0f55-4338-99f4-52f33d491a09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.163375 4713 generic.go:334] "Generic (PLEG): container finished" podID="67857300-0f55-4338-99f4-52f33d491a09" containerID="4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8" exitCode=0
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.163429 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8dwg" event={"ID":"67857300-0f55-4338-99f4-52f33d491a09","Type":"ContainerDied","Data":"4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8"}
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.163463 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8dwg" event={"ID":"67857300-0f55-4338-99f4-52f33d491a09","Type":"ContainerDied","Data":"1d2e8bb63ed9564f8da39e257238dcd200006022a4cceb2759144906de35fdbf"}
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.163482 4713 scope.go:117] "RemoveContainer" containerID="4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8"
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.163637 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8dwg"
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.183665 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67857300-0f55-4338-99f4-52f33d491a09-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.199245 4713 scope.go:117] "RemoveContainer" containerID="bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b"
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.206635 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m8dwg"]
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.234138 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m8dwg"]
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.237120 4713 scope.go:117] "RemoveContainer" containerID="eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4"
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.289436 4713 scope.go:117] "RemoveContainer" containerID="4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8"
Mar 14 06:35:44 crc kubenswrapper[4713]: E0314 06:35:44.289944 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8\": container with ID starting with 4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8 not found: ID does not exist" containerID="4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8"
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.289988 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8"} err="failed to get container status \"4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8\": rpc error: code = NotFound desc = could not find container \"4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8\": container with ID starting with 4acfee35f07878afe66d78a97463917bdecdf71eca2a0f262ee81200708582b8 not found: ID does not exist"
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.290015 4713 scope.go:117] "RemoveContainer" containerID="bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b"
Mar 14 06:35:44 crc kubenswrapper[4713]: E0314 06:35:44.290452 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b\": container with ID starting with bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b not found: ID does not exist" containerID="bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b"
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.290480 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b"} err="failed to get container status \"bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b\": rpc error: code = NotFound desc = could not find container \"bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b\": container with ID starting with bbbe50b2da4d88c077d3e5dc7129787e69d4319170a5b1d3dc7102217235c77b not found: ID does not exist"
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.290495 4713 scope.go:117] "RemoveContainer" containerID="eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4"
Mar 14 06:35:44 crc kubenswrapper[4713]: E0314 06:35:44.290730 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4\": container with ID starting with eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4 not found: ID does not exist" containerID="eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4"
Mar 14 06:35:44 crc kubenswrapper[4713]: I0314 06:35:44.290752 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4"} err="failed to get container status \"eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4\": rpc error: code = NotFound desc = could not find container \"eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4\": container with ID starting with eca1be7ca8cd8298e0c8a6bf9d91eeea686dad86a18400bf94247da8ac65e7e4 not found: ID does not exist"
Mar 14 06:35:45 crc kubenswrapper[4713]: I0314 06:35:45.576305 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67857300-0f55-4338-99f4-52f33d491a09" path="/var/lib/kubelet/pods/67857300-0f55-4338-99f4-52f33d491a09/volumes"
Mar 14 06:35:57 crc kubenswrapper[4713]: I0314 06:35:57.573496 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:35:57 crc kubenswrapper[4713]: E0314 06:35:57.574456 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.153800 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557836-zvct8"]
Mar 14 06:36:00 crc kubenswrapper[4713]: E0314 06:36:00.154833 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="extract-utilities"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.154846 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="extract-utilities"
Mar 14 06:36:00 crc kubenswrapper[4713]: E0314 06:36:00.154885 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.154892 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server"
Mar 14 06:36:00 crc kubenswrapper[4713]: E0314 06:36:00.154906 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="extract-content"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.154913 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="extract-content"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.155153 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="67857300-0f55-4338-99f4-52f33d491a09" containerName="registry-server"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.156146 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557836-zvct8"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.158559 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.158807 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.158856 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.167406 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557836-zvct8"]
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.210234 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkxvp\" (UniqueName: \"kubernetes.io/projected/8df5874d-d429-4ddd-9172-96742cd80f47-kube-api-access-tkxvp\") pod \"auto-csr-approver-29557836-zvct8\" (UID: \"8df5874d-d429-4ddd-9172-96742cd80f47\") " pod="openshift-infra/auto-csr-approver-29557836-zvct8"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.312444 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkxvp\" (UniqueName: \"kubernetes.io/projected/8df5874d-d429-4ddd-9172-96742cd80f47-kube-api-access-tkxvp\") pod \"auto-csr-approver-29557836-zvct8\" (UID: \"8df5874d-d429-4ddd-9172-96742cd80f47\") " pod="openshift-infra/auto-csr-approver-29557836-zvct8"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.339084 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkxvp\" (UniqueName: \"kubernetes.io/projected/8df5874d-d429-4ddd-9172-96742cd80f47-kube-api-access-tkxvp\") pod \"auto-csr-approver-29557836-zvct8\" (UID: \"8df5874d-d429-4ddd-9172-96742cd80f47\") " pod="openshift-infra/auto-csr-approver-29557836-zvct8"
Mar 14 06:36:00 crc kubenswrapper[4713]: I0314 06:36:00.495458 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557836-zvct8"
Mar 14 06:36:01 crc kubenswrapper[4713]: I0314 06:36:01.024805 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557836-zvct8"]
Mar 14 06:36:01 crc kubenswrapper[4713]: I0314 06:36:01.349724 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557836-zvct8" event={"ID":"8df5874d-d429-4ddd-9172-96742cd80f47","Type":"ContainerStarted","Data":"0ecc77647cbb6d9be3603a61c70a9b5d08613be3e0cc8127f7982f6a5305b6da"}
Mar 14 06:36:03 crc kubenswrapper[4713]: I0314 06:36:03.371159 4713 generic.go:334] "Generic (PLEG): container finished" podID="8df5874d-d429-4ddd-9172-96742cd80f47" containerID="421fad5f36a0cd00923afd9947150100726bc2fdf85dee8dc7c5e7fdb792b293" exitCode=0
Mar 14 06:36:03 crc kubenswrapper[4713]: I0314 06:36:03.371239 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557836-zvct8" event={"ID":"8df5874d-d429-4ddd-9172-96742cd80f47","Type":"ContainerDied","Data":"421fad5f36a0cd00923afd9947150100726bc2fdf85dee8dc7c5e7fdb792b293"}
Mar 14 06:36:04 crc kubenswrapper[4713]: I0314 06:36:04.782730 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557836-zvct8"
Mar 14 06:36:04 crc kubenswrapper[4713]: I0314 06:36:04.826123 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkxvp\" (UniqueName: \"kubernetes.io/projected/8df5874d-d429-4ddd-9172-96742cd80f47-kube-api-access-tkxvp\") pod \"8df5874d-d429-4ddd-9172-96742cd80f47\" (UID: \"8df5874d-d429-4ddd-9172-96742cd80f47\") "
Mar 14 06:36:04 crc kubenswrapper[4713]: I0314 06:36:04.833077 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df5874d-d429-4ddd-9172-96742cd80f47-kube-api-access-tkxvp" (OuterVolumeSpecName: "kube-api-access-tkxvp") pod "8df5874d-d429-4ddd-9172-96742cd80f47" (UID: "8df5874d-d429-4ddd-9172-96742cd80f47"). InnerVolumeSpecName "kube-api-access-tkxvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:36:04 crc kubenswrapper[4713]: I0314 06:36:04.929707 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkxvp\" (UniqueName: \"kubernetes.io/projected/8df5874d-d429-4ddd-9172-96742cd80f47-kube-api-access-tkxvp\") on node \"crc\" DevicePath \"\""
Mar 14 06:36:05 crc kubenswrapper[4713]: I0314 06:36:05.697573 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557836-zvct8"
Mar 14 06:36:05 crc kubenswrapper[4713]: I0314 06:36:05.699752 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557836-zvct8" event={"ID":"8df5874d-d429-4ddd-9172-96742cd80f47","Type":"ContainerDied","Data":"0ecc77647cbb6d9be3603a61c70a9b5d08613be3e0cc8127f7982f6a5305b6da"}
Mar 14 06:36:05 crc kubenswrapper[4713]: I0314 06:36:05.699815 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ecc77647cbb6d9be3603a61c70a9b5d08613be3e0cc8127f7982f6a5305b6da"
Mar 14 06:36:05 crc kubenswrapper[4713]: I0314 06:36:05.881694 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557830-kfkcp"]
Mar 14 06:36:05 crc kubenswrapper[4713]: I0314 06:36:05.894540 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557830-kfkcp"]
Mar 14 06:36:08 crc kubenswrapper[4713]: I0314 06:36:08.042897 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc3d796-1079-44bb-9c1b-47c7aab13fe4" path="/var/lib/kubelet/pods/0cc3d796-1079-44bb-9c1b-47c7aab13fe4/volumes"
Mar 14 06:36:11 crc kubenswrapper[4713]: I0314 06:36:11.564040 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:36:11 crc kubenswrapper[4713]: E0314 06:36:11.565116 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:36:25 crc kubenswrapper[4713]: I0314 06:36:25.564420 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:36:25 crc kubenswrapper[4713]: E0314 06:36:25.565174 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:36:36 crc kubenswrapper[4713]: I0314 06:36:36.564478 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:36:36 crc kubenswrapper[4713]: E0314 06:36:36.565292 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:36:49 crc kubenswrapper[4713]: I0314 06:36:49.574132 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387"
Mar 14 06:36:50 crc kubenswrapper[4713]: I0314 06:36:50.137314 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"290e716be34b1e88381d9fb3675a5b2ee53c4b4de14c38b4b0ed9d78c0ef5154"}
Mar 14 06:36:53 crc kubenswrapper[4713]: I0314 06:36:53.784240 4713 scope.go:117] "RemoveContainer" containerID="dfc245dd3c9e38bb7e9018e01254c04cff4a8a7f66accef2e4622b275f1334cf"
Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.144097 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557838-n68cv"]
Mar 14 06:38:00 crc kubenswrapper[4713]: E0314 06:38:00.145093 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df5874d-d429-4ddd-9172-96742cd80f47" containerName="oc"
Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.145106 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df5874d-d429-4ddd-9172-96742cd80f47" containerName="oc"
Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.145346 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df5874d-d429-4ddd-9172-96742cd80f47" containerName="oc"
Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.146192 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557838-n68cv"
Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.148482 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.148741 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.148789 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.153763 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557838-n68cv"]
Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.321686 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9cdl\" (UniqueName: \"kubernetes.io/projected/43491cff-9721-4697-94ec-135986e04b5d-kube-api-access-v9cdl\") pod \"auto-csr-approver-29557838-n68cv\" (UID: \"43491cff-9721-4697-94ec-135986e04b5d\") "
pod="openshift-infra/auto-csr-approver-29557838-n68cv" Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.423969 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9cdl\" (UniqueName: \"kubernetes.io/projected/43491cff-9721-4697-94ec-135986e04b5d-kube-api-access-v9cdl\") pod \"auto-csr-approver-29557838-n68cv\" (UID: \"43491cff-9721-4697-94ec-135986e04b5d\") " pod="openshift-infra/auto-csr-approver-29557838-n68cv" Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.447337 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9cdl\" (UniqueName: \"kubernetes.io/projected/43491cff-9721-4697-94ec-135986e04b5d-kube-api-access-v9cdl\") pod \"auto-csr-approver-29557838-n68cv\" (UID: \"43491cff-9721-4697-94ec-135986e04b5d\") " pod="openshift-infra/auto-csr-approver-29557838-n68cv" Mar 14 06:38:00 crc kubenswrapper[4713]: I0314 06:38:00.479783 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557838-n68cv" Mar 14 06:38:01 crc kubenswrapper[4713]: I0314 06:38:01.018744 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557838-n68cv"] Mar 14 06:38:01 crc kubenswrapper[4713]: I0314 06:38:01.919550 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557838-n68cv" event={"ID":"43491cff-9721-4697-94ec-135986e04b5d","Type":"ContainerStarted","Data":"68e4cddd3ff64e94b9112f5ca5b1a27956e78425595cd00544d47a5d3e5968e9"} Mar 14 06:38:02 crc kubenswrapper[4713]: E0314 06:38:02.671802 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43491cff_9721_4697_94ec_135986e04b5d.slice/crio-13f3fa4d309f4e7dd8c9013cb5a00ce18c9e9bc24916da2d8d9013b50d1a7940.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43491cff_9721_4697_94ec_135986e04b5d.slice/crio-conmon-13f3fa4d309f4e7dd8c9013cb5a00ce18c9e9bc24916da2d8d9013b50d1a7940.scope\": RecentStats: unable to find data in memory cache]" Mar 14 06:38:02 crc kubenswrapper[4713]: I0314 06:38:02.931287 4713 generic.go:334] "Generic (PLEG): container finished" podID="43491cff-9721-4697-94ec-135986e04b5d" containerID="13f3fa4d309f4e7dd8c9013cb5a00ce18c9e9bc24916da2d8d9013b50d1a7940" exitCode=0 Mar 14 06:38:02 crc kubenswrapper[4713]: I0314 06:38:02.931360 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557838-n68cv" event={"ID":"43491cff-9721-4697-94ec-135986e04b5d","Type":"ContainerDied","Data":"13f3fa4d309f4e7dd8c9013cb5a00ce18c9e9bc24916da2d8d9013b50d1a7940"} Mar 14 06:38:04 crc kubenswrapper[4713]: I0314 06:38:04.337150 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557838-n68cv" Mar 14 06:38:04 crc kubenswrapper[4713]: I0314 06:38:04.438366 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9cdl\" (UniqueName: \"kubernetes.io/projected/43491cff-9721-4697-94ec-135986e04b5d-kube-api-access-v9cdl\") pod \"43491cff-9721-4697-94ec-135986e04b5d\" (UID: \"43491cff-9721-4697-94ec-135986e04b5d\") " Mar 14 06:38:04 crc kubenswrapper[4713]: I0314 06:38:04.444218 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43491cff-9721-4697-94ec-135986e04b5d-kube-api-access-v9cdl" (OuterVolumeSpecName: "kube-api-access-v9cdl") pod "43491cff-9721-4697-94ec-135986e04b5d" (UID: "43491cff-9721-4697-94ec-135986e04b5d"). InnerVolumeSpecName "kube-api-access-v9cdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:38:04 crc kubenswrapper[4713]: I0314 06:38:04.541856 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9cdl\" (UniqueName: \"kubernetes.io/projected/43491cff-9721-4697-94ec-135986e04b5d-kube-api-access-v9cdl\") on node \"crc\" DevicePath \"\"" Mar 14 06:38:04 crc kubenswrapper[4713]: I0314 06:38:04.954902 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557838-n68cv" event={"ID":"43491cff-9721-4697-94ec-135986e04b5d","Type":"ContainerDied","Data":"68e4cddd3ff64e94b9112f5ca5b1a27956e78425595cd00544d47a5d3e5968e9"} Mar 14 06:38:04 crc kubenswrapper[4713]: I0314 06:38:04.954945 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e4cddd3ff64e94b9112f5ca5b1a27956e78425595cd00544d47a5d3e5968e9" Mar 14 06:38:04 crc kubenswrapper[4713]: I0314 06:38:04.955019 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557838-n68cv" Mar 14 06:38:05 crc kubenswrapper[4713]: I0314 06:38:05.406678 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557832-mszng"] Mar 14 06:38:05 crc kubenswrapper[4713]: I0314 06:38:05.418982 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557832-mszng"] Mar 14 06:38:05 crc kubenswrapper[4713]: I0314 06:38:05.577298 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59751ee7-09b0-469a-8f13-4070b096c60e" path="/var/lib/kubelet/pods/59751ee7-09b0-469a-8f13-4070b096c60e/volumes" Mar 14 06:38:53 crc kubenswrapper[4713]: I0314 06:38:53.918689 4713 scope.go:117] "RemoveContainer" containerID="938634140a338183571fe3eb83b2773480cc9e51964be9c119ab5abfaac1b413" Mar 14 06:39:10 crc kubenswrapper[4713]: I0314 06:39:10.731280 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:39:10 crc kubenswrapper[4713]: I0314 06:39:10.731959 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:39:40 crc kubenswrapper[4713]: I0314 06:39:40.731552 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:39:40 crc kubenswrapper[4713]: I0314 06:39:40.732072 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.144758 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557840-b7dwd"] Mar 14 06:40:00 crc kubenswrapper[4713]: E0314 06:40:00.145707 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43491cff-9721-4697-94ec-135986e04b5d" containerName="oc" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.145720 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="43491cff-9721-4697-94ec-135986e04b5d" containerName="oc" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.145992 4713 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="43491cff-9721-4697-94ec-135986e04b5d" containerName="oc" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.146840 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557840-b7dwd" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.149333 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.149551 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.170085 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557840-b7dwd"] Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.189577 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2vcp\" (UniqueName: \"kubernetes.io/projected/2868b35f-86fe-4b44-98d7-8d5a429938f3-kube-api-access-p2vcp\") pod \"auto-csr-approver-29557840-b7dwd\" (UID: \"2868b35f-86fe-4b44-98d7-8d5a429938f3\") " pod="openshift-infra/auto-csr-approver-29557840-b7dwd" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.192595 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.291826 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2vcp\" (UniqueName: \"kubernetes.io/projected/2868b35f-86fe-4b44-98d7-8d5a429938f3-kube-api-access-p2vcp\") pod \"auto-csr-approver-29557840-b7dwd\" (UID: \"2868b35f-86fe-4b44-98d7-8d5a429938f3\") " pod="openshift-infra/auto-csr-approver-29557840-b7dwd" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.310905 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p2vcp\" (UniqueName: \"kubernetes.io/projected/2868b35f-86fe-4b44-98d7-8d5a429938f3-kube-api-access-p2vcp\") pod \"auto-csr-approver-29557840-b7dwd\" (UID: \"2868b35f-86fe-4b44-98d7-8d5a429938f3\") " pod="openshift-infra/auto-csr-approver-29557840-b7dwd" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.510159 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557840-b7dwd" Mar 14 06:40:00 crc kubenswrapper[4713]: I0314 06:40:00.998894 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557840-b7dwd"] Mar 14 06:40:01 crc kubenswrapper[4713]: I0314 06:40:01.006957 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 06:40:01 crc kubenswrapper[4713]: I0314 06:40:01.222663 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557840-b7dwd" event={"ID":"2868b35f-86fe-4b44-98d7-8d5a429938f3","Type":"ContainerStarted","Data":"b77b428bae0a198176e4300f53346c20b1c9c1a508ee0bbb93bfa80c8aac9cf1"} Mar 14 06:40:02 crc kubenswrapper[4713]: I0314 06:40:02.236009 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557840-b7dwd" event={"ID":"2868b35f-86fe-4b44-98d7-8d5a429938f3","Type":"ContainerStarted","Data":"289dd2588fddaf3371179da8d1bdb7a81c3845268272a86d9abcc9f0cad2fbc9"} Mar 14 06:40:02 crc kubenswrapper[4713]: I0314 06:40:02.250317 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557840-b7dwd" podStartSLOduration=1.2994183910000001 podStartE2EDuration="2.25030036s" podCreationTimestamp="2026-03-14 06:40:00 +0000 UTC" firstStartedPulling="2026-03-14 06:40:01.006772347 +0000 UTC m=+4384.094681647" lastFinishedPulling="2026-03-14 06:40:01.957654316 +0000 UTC m=+4385.045563616" observedRunningTime="2026-03-14 06:40:02.24964351 +0000 UTC 
m=+4385.337552810" watchObservedRunningTime="2026-03-14 06:40:02.25030036 +0000 UTC m=+4385.338209660" Mar 14 06:40:03 crc kubenswrapper[4713]: I0314 06:40:03.248744 4713 generic.go:334] "Generic (PLEG): container finished" podID="2868b35f-86fe-4b44-98d7-8d5a429938f3" containerID="289dd2588fddaf3371179da8d1bdb7a81c3845268272a86d9abcc9f0cad2fbc9" exitCode=0 Mar 14 06:40:03 crc kubenswrapper[4713]: I0314 06:40:03.248838 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557840-b7dwd" event={"ID":"2868b35f-86fe-4b44-98d7-8d5a429938f3","Type":"ContainerDied","Data":"289dd2588fddaf3371179da8d1bdb7a81c3845268272a86d9abcc9f0cad2fbc9"} Mar 14 06:40:04 crc kubenswrapper[4713]: I0314 06:40:04.676889 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557840-b7dwd" Mar 14 06:40:04 crc kubenswrapper[4713]: I0314 06:40:04.725499 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2vcp\" (UniqueName: \"kubernetes.io/projected/2868b35f-86fe-4b44-98d7-8d5a429938f3-kube-api-access-p2vcp\") pod \"2868b35f-86fe-4b44-98d7-8d5a429938f3\" (UID: \"2868b35f-86fe-4b44-98d7-8d5a429938f3\") " Mar 14 06:40:04 crc kubenswrapper[4713]: I0314 06:40:04.731944 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2868b35f-86fe-4b44-98d7-8d5a429938f3-kube-api-access-p2vcp" (OuterVolumeSpecName: "kube-api-access-p2vcp") pod "2868b35f-86fe-4b44-98d7-8d5a429938f3" (UID: "2868b35f-86fe-4b44-98d7-8d5a429938f3"). InnerVolumeSpecName "kube-api-access-p2vcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:40:04 crc kubenswrapper[4713]: I0314 06:40:04.828157 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2vcp\" (UniqueName: \"kubernetes.io/projected/2868b35f-86fe-4b44-98d7-8d5a429938f3-kube-api-access-p2vcp\") on node \"crc\" DevicePath \"\"" Mar 14 06:40:05 crc kubenswrapper[4713]: I0314 06:40:05.297344 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557840-b7dwd" event={"ID":"2868b35f-86fe-4b44-98d7-8d5a429938f3","Type":"ContainerDied","Data":"b77b428bae0a198176e4300f53346c20b1c9c1a508ee0bbb93bfa80c8aac9cf1"} Mar 14 06:40:05 crc kubenswrapper[4713]: I0314 06:40:05.297386 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b77b428bae0a198176e4300f53346c20b1c9c1a508ee0bbb93bfa80c8aac9cf1" Mar 14 06:40:05 crc kubenswrapper[4713]: I0314 06:40:05.297437 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557840-b7dwd" Mar 14 06:40:05 crc kubenswrapper[4713]: I0314 06:40:05.358714 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557834-nrlml"] Mar 14 06:40:05 crc kubenswrapper[4713]: I0314 06:40:05.370368 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557834-nrlml"] Mar 14 06:40:05 crc kubenswrapper[4713]: I0314 06:40:05.576563 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c047064b-149c-47fe-8c68-d6de8f0c9bd6" path="/var/lib/kubelet/pods/c047064b-149c-47fe-8c68-d6de8f0c9bd6/volumes" Mar 14 06:40:10 crc kubenswrapper[4713]: I0314 06:40:10.731908 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 14 06:40:10 crc kubenswrapper[4713]: I0314 06:40:10.732574 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:40:10 crc kubenswrapper[4713]: I0314 06:40:10.732628 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 06:40:10 crc kubenswrapper[4713]: I0314 06:40:10.733813 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"290e716be34b1e88381d9fb3675a5b2ee53c4b4de14c38b4b0ed9d78c0ef5154"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 06:40:10 crc kubenswrapper[4713]: I0314 06:40:10.733876 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://290e716be34b1e88381d9fb3675a5b2ee53c4b4de14c38b4b0ed9d78c0ef5154" gracePeriod=600 Mar 14 06:40:11 crc kubenswrapper[4713]: I0314 06:40:11.360049 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="290e716be34b1e88381d9fb3675a5b2ee53c4b4de14c38b4b0ed9d78c0ef5154" exitCode=0 Mar 14 06:40:11 crc kubenswrapper[4713]: I0314 06:40:11.360112 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"290e716be34b1e88381d9fb3675a5b2ee53c4b4de14c38b4b0ed9d78c0ef5154"} Mar 14 06:40:11 crc kubenswrapper[4713]: I0314 06:40:11.360465 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"} Mar 14 06:40:11 crc kubenswrapper[4713]: I0314 06:40:11.360487 4713 scope.go:117] "RemoveContainer" containerID="4737e6b64ec27826b2478ac214de2d999f80d6428063e07b52780065cad86387" Mar 14 06:40:54 crc kubenswrapper[4713]: I0314 06:40:54.042402 4713 scope.go:117] "RemoveContainer" containerID="f20c94dfab89fffeaab54f26c36eaa3f56908e6b7f1a2e96ef6e832f16011c73" Mar 14 06:41:23 crc kubenswrapper[4713]: I0314 06:41:23.860249 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q2nwf"] Mar 14 06:41:23 crc kubenswrapper[4713]: E0314 06:41:23.861424 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2868b35f-86fe-4b44-98d7-8d5a429938f3" containerName="oc" Mar 14 06:41:23 crc kubenswrapper[4713]: I0314 06:41:23.861444 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2868b35f-86fe-4b44-98d7-8d5a429938f3" containerName="oc" Mar 14 06:41:23 crc kubenswrapper[4713]: I0314 06:41:23.861806 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2868b35f-86fe-4b44-98d7-8d5a429938f3" containerName="oc" Mar 14 06:41:23 crc kubenswrapper[4713]: I0314 06:41:23.868089 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:23 crc kubenswrapper[4713]: I0314 06:41:23.879417 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2nwf"] Mar 14 06:41:23 crc kubenswrapper[4713]: I0314 06:41:23.952112 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72ld7\" (UniqueName: \"kubernetes.io/projected/cc201607-7946-4532-8888-1249fc35e279-kube-api-access-72ld7\") pod \"community-operators-q2nwf\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:23 crc kubenswrapper[4713]: I0314 06:41:23.952242 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-catalog-content\") pod \"community-operators-q2nwf\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:23 crc kubenswrapper[4713]: I0314 06:41:23.952362 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-utilities\") pod \"community-operators-q2nwf\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.054149 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-catalog-content\") pod \"community-operators-q2nwf\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.054320 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-utilities\") pod \"community-operators-q2nwf\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.054458 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7xg5z"] Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.054499 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72ld7\" (UniqueName: \"kubernetes.io/projected/cc201607-7946-4532-8888-1249fc35e279-kube-api-access-72ld7\") pod \"community-operators-q2nwf\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.054654 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-catalog-content\") pod \"community-operators-q2nwf\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.054885 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-utilities\") pod \"community-operators-q2nwf\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.056763 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.067390 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7xg5z"] Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.156989 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-utilities\") pod \"certified-operators-7xg5z\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.157444 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-catalog-content\") pod \"certified-operators-7xg5z\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.157520 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlhvk\" (UniqueName: \"kubernetes.io/projected/5f649dda-e14a-4096-9b44-64ed89330c56-kube-api-access-nlhvk\") pod \"certified-operators-7xg5z\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.259754 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-utilities\") pod \"certified-operators-7xg5z\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.260044 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-catalog-content\") pod \"certified-operators-7xg5z\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.260084 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlhvk\" (UniqueName: \"kubernetes.io/projected/5f649dda-e14a-4096-9b44-64ed89330c56-kube-api-access-nlhvk\") pod \"certified-operators-7xg5z\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.260318 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-utilities\") pod \"certified-operators-7xg5z\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.260409 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-catalog-content\") pod \"certified-operators-7xg5z\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.420158 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72ld7\" (UniqueName: \"kubernetes.io/projected/cc201607-7946-4532-8888-1249fc35e279-kube-api-access-72ld7\") pod \"community-operators-q2nwf\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.420857 4713 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nlhvk\" (UniqueName: \"kubernetes.io/projected/5f649dda-e14a-4096-9b44-64ed89330c56-kube-api-access-nlhvk\") pod \"certified-operators-7xg5z\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.491556 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:24 crc kubenswrapper[4713]: I0314 06:41:24.684614 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:25 crc kubenswrapper[4713]: I0314 06:41:25.096589 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2nwf"] Mar 14 06:41:25 crc kubenswrapper[4713]: I0314 06:41:25.172663 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2nwf" event={"ID":"cc201607-7946-4532-8888-1249fc35e279","Type":"ContainerStarted","Data":"893cdec3718c8d1c820c8bc65a8add20a5a44f8df9c133295170af90df5a9216"} Mar 14 06:41:25 crc kubenswrapper[4713]: I0314 06:41:25.347995 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7xg5z"] Mar 14 06:41:25 crc kubenswrapper[4713]: W0314 06:41:25.386475 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f649dda_e14a_4096_9b44_64ed89330c56.slice/crio-0ea5751440b69fc23084521d4df87bcb5ac85dbba92d4335c26eebccd76b7477 WatchSource:0}: Error finding container 0ea5751440b69fc23084521d4df87bcb5ac85dbba92d4335c26eebccd76b7477: Status 404 returned error can't find the container with id 0ea5751440b69fc23084521d4df87bcb5ac85dbba92d4335c26eebccd76b7477 Mar 14 06:41:26 crc kubenswrapper[4713]: I0314 06:41:26.184962 4713 generic.go:334] "Generic (PLEG): container 
finished" podID="5f649dda-e14a-4096-9b44-64ed89330c56" containerID="567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263" exitCode=0 Mar 14 06:41:26 crc kubenswrapper[4713]: I0314 06:41:26.185030 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xg5z" event={"ID":"5f649dda-e14a-4096-9b44-64ed89330c56","Type":"ContainerDied","Data":"567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263"} Mar 14 06:41:26 crc kubenswrapper[4713]: I0314 06:41:26.185431 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xg5z" event={"ID":"5f649dda-e14a-4096-9b44-64ed89330c56","Type":"ContainerStarted","Data":"0ea5751440b69fc23084521d4df87bcb5ac85dbba92d4335c26eebccd76b7477"} Mar 14 06:41:26 crc kubenswrapper[4713]: I0314 06:41:26.187798 4713 generic.go:334] "Generic (PLEG): container finished" podID="cc201607-7946-4532-8888-1249fc35e279" containerID="eed48af7a194eb4bddc616de73d0bb6f57352872c8489e5a60fe9f4409ec0c5a" exitCode=0 Mar 14 06:41:26 crc kubenswrapper[4713]: I0314 06:41:26.187845 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2nwf" event={"ID":"cc201607-7946-4532-8888-1249fc35e279","Type":"ContainerDied","Data":"eed48af7a194eb4bddc616de73d0bb6f57352872c8489e5a60fe9f4409ec0c5a"} Mar 14 06:41:27 crc kubenswrapper[4713]: I0314 06:41:27.198714 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xg5z" event={"ID":"5f649dda-e14a-4096-9b44-64ed89330c56","Type":"ContainerStarted","Data":"c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4"} Mar 14 06:41:28 crc kubenswrapper[4713]: I0314 06:41:28.213309 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2nwf" 
event={"ID":"cc201607-7946-4532-8888-1249fc35e279","Type":"ContainerStarted","Data":"55c5e573695238c02c61fc1b8a5023f2138fe04d3f65885e7c0e89a4b40888e8"} Mar 14 06:41:29 crc kubenswrapper[4713]: I0314 06:41:29.234831 4713 generic.go:334] "Generic (PLEG): container finished" podID="5f649dda-e14a-4096-9b44-64ed89330c56" containerID="c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4" exitCode=0 Mar 14 06:41:29 crc kubenswrapper[4713]: I0314 06:41:29.234905 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xg5z" event={"ID":"5f649dda-e14a-4096-9b44-64ed89330c56","Type":"ContainerDied","Data":"c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4"} Mar 14 06:41:30 crc kubenswrapper[4713]: I0314 06:41:30.253929 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xg5z" event={"ID":"5f649dda-e14a-4096-9b44-64ed89330c56","Type":"ContainerStarted","Data":"22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e"} Mar 14 06:41:30 crc kubenswrapper[4713]: I0314 06:41:30.256910 4713 generic.go:334] "Generic (PLEG): container finished" podID="cc201607-7946-4532-8888-1249fc35e279" containerID="55c5e573695238c02c61fc1b8a5023f2138fe04d3f65885e7c0e89a4b40888e8" exitCode=0 Mar 14 06:41:30 crc kubenswrapper[4713]: I0314 06:41:30.256990 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2nwf" event={"ID":"cc201607-7946-4532-8888-1249fc35e279","Type":"ContainerDied","Data":"55c5e573695238c02c61fc1b8a5023f2138fe04d3f65885e7c0e89a4b40888e8"} Mar 14 06:41:30 crc kubenswrapper[4713]: I0314 06:41:30.289187 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7xg5z" podStartSLOduration=2.816631827 podStartE2EDuration="6.289168869s" podCreationTimestamp="2026-03-14 06:41:24 +0000 UTC" firstStartedPulling="2026-03-14 06:41:26.187393082 +0000 
UTC m=+4469.275302392" lastFinishedPulling="2026-03-14 06:41:29.659930134 +0000 UTC m=+4472.747839434" observedRunningTime="2026-03-14 06:41:30.276656928 +0000 UTC m=+4473.364566258" watchObservedRunningTime="2026-03-14 06:41:30.289168869 +0000 UTC m=+4473.377078169" Mar 14 06:41:31 crc kubenswrapper[4713]: I0314 06:41:31.278640 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2nwf" event={"ID":"cc201607-7946-4532-8888-1249fc35e279","Type":"ContainerStarted","Data":"e3621e6333e477bc908ff5dc66a3fa78965ec7955486b8b75eb0bdeee1c3a434"} Mar 14 06:41:31 crc kubenswrapper[4713]: I0314 06:41:31.305108 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q2nwf" podStartSLOduration=3.843352286 podStartE2EDuration="8.305089258s" podCreationTimestamp="2026-03-14 06:41:23 +0000 UTC" firstStartedPulling="2026-03-14 06:41:26.190380205 +0000 UTC m=+4469.278289506" lastFinishedPulling="2026-03-14 06:41:30.652117168 +0000 UTC m=+4473.740026478" observedRunningTime="2026-03-14 06:41:31.304301374 +0000 UTC m=+4474.392210674" watchObservedRunningTime="2026-03-14 06:41:31.305089258 +0000 UTC m=+4474.392998568" Mar 14 06:41:34 crc kubenswrapper[4713]: I0314 06:41:34.492882 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:34 crc kubenswrapper[4713]: I0314 06:41:34.495253 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:34 crc kubenswrapper[4713]: I0314 06:41:34.685154 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:34 crc kubenswrapper[4713]: I0314 06:41:34.685823 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 
06:41:34 crc kubenswrapper[4713]: I0314 06:41:34.749451 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:35 crc kubenswrapper[4713]: I0314 06:41:35.555264 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:35 crc kubenswrapper[4713]: I0314 06:41:35.561416 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-q2nwf" podUID="cc201607-7946-4532-8888-1249fc35e279" containerName="registry-server" probeResult="failure" output=< Mar 14 06:41:35 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:41:35 crc kubenswrapper[4713]: > Mar 14 06:41:36 crc kubenswrapper[4713]: I0314 06:41:36.843405 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7xg5z"] Mar 14 06:41:37 crc kubenswrapper[4713]: I0314 06:41:37.402391 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7xg5z" podUID="5f649dda-e14a-4096-9b44-64ed89330c56" containerName="registry-server" containerID="cri-o://22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e" gracePeriod=2 Mar 14 06:41:37 crc kubenswrapper[4713]: I0314 06:41:37.986990 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.140982 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-catalog-content\") pod \"5f649dda-e14a-4096-9b44-64ed89330c56\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.141051 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlhvk\" (UniqueName: \"kubernetes.io/projected/5f649dda-e14a-4096-9b44-64ed89330c56-kube-api-access-nlhvk\") pod \"5f649dda-e14a-4096-9b44-64ed89330c56\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.141219 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-utilities\") pod \"5f649dda-e14a-4096-9b44-64ed89330c56\" (UID: \"5f649dda-e14a-4096-9b44-64ed89330c56\") " Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.141959 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-utilities" (OuterVolumeSpecName: "utilities") pod "5f649dda-e14a-4096-9b44-64ed89330c56" (UID: "5f649dda-e14a-4096-9b44-64ed89330c56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.152158 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f649dda-e14a-4096-9b44-64ed89330c56-kube-api-access-nlhvk" (OuterVolumeSpecName: "kube-api-access-nlhvk") pod "5f649dda-e14a-4096-9b44-64ed89330c56" (UID: "5f649dda-e14a-4096-9b44-64ed89330c56"). InnerVolumeSpecName "kube-api-access-nlhvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.194517 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f649dda-e14a-4096-9b44-64ed89330c56" (UID: "5f649dda-e14a-4096-9b44-64ed89330c56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.244771 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.244815 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlhvk\" (UniqueName: \"kubernetes.io/projected/5f649dda-e14a-4096-9b44-64ed89330c56-kube-api-access-nlhvk\") on node \"crc\" DevicePath \"\"" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.244827 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f649dda-e14a-4096-9b44-64ed89330c56-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.421872 4713 generic.go:334] "Generic (PLEG): container finished" podID="5f649dda-e14a-4096-9b44-64ed89330c56" containerID="22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e" exitCode=0 Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.421927 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7xg5z" event={"ID":"5f649dda-e14a-4096-9b44-64ed89330c56","Type":"ContainerDied","Data":"22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e"} Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.422337 4713 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-7xg5z" event={"ID":"5f649dda-e14a-4096-9b44-64ed89330c56","Type":"ContainerDied","Data":"0ea5751440b69fc23084521d4df87bcb5ac85dbba92d4335c26eebccd76b7477"} Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.422369 4713 scope.go:117] "RemoveContainer" containerID="22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.422031 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7xg5z" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.460518 4713 scope.go:117] "RemoveContainer" containerID="c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4" Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.481218 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7xg5z"] Mar 14 06:41:38 crc kubenswrapper[4713]: I0314 06:41:38.529349 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7xg5z"] Mar 14 06:41:39 crc kubenswrapper[4713]: I0314 06:41:39.245817 4713 scope.go:117] "RemoveContainer" containerID="567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263" Mar 14 06:41:39 crc kubenswrapper[4713]: I0314 06:41:39.301362 4713 scope.go:117] "RemoveContainer" containerID="22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e" Mar 14 06:41:39 crc kubenswrapper[4713]: E0314 06:41:39.302099 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e\": container with ID starting with 22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e not found: ID does not exist" containerID="22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e" Mar 14 06:41:39 crc kubenswrapper[4713]: I0314 
06:41:39.302156 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e"} err="failed to get container status \"22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e\": rpc error: code = NotFound desc = could not find container \"22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e\": container with ID starting with 22a787a22387b6c2056fede140d518a5bae33994c8c102710e3bdcc52863ea0e not found: ID does not exist" Mar 14 06:41:39 crc kubenswrapper[4713]: I0314 06:41:39.302186 4713 scope.go:117] "RemoveContainer" containerID="c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4" Mar 14 06:41:39 crc kubenswrapper[4713]: E0314 06:41:39.303004 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4\": container with ID starting with c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4 not found: ID does not exist" containerID="c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4" Mar 14 06:41:39 crc kubenswrapper[4713]: I0314 06:41:39.303036 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4"} err="failed to get container status \"c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4\": rpc error: code = NotFound desc = could not find container \"c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4\": container with ID starting with c4dda993e4b63cb2de76c97eee5fc843dbaca3eb50e3fcfb3eb6dd4b516b78b4 not found: ID does not exist" Mar 14 06:41:39 crc kubenswrapper[4713]: I0314 06:41:39.303067 4713 scope.go:117] "RemoveContainer" containerID="567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263" Mar 14 06:41:39 crc 
kubenswrapper[4713]: E0314 06:41:39.303322 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263\": container with ID starting with 567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263 not found: ID does not exist" containerID="567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263" Mar 14 06:41:39 crc kubenswrapper[4713]: I0314 06:41:39.303354 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263"} err="failed to get container status \"567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263\": rpc error: code = NotFound desc = could not find container \"567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263\": container with ID starting with 567ad4fc98f7505a718fce7b9de46960c9ac5c7de8a96851e63e8a04b0f1f263 not found: ID does not exist" Mar 14 06:41:39 crc kubenswrapper[4713]: I0314 06:41:39.576366 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f649dda-e14a-4096-9b44-64ed89330c56" path="/var/lib/kubelet/pods/5f649dda-e14a-4096-9b44-64ed89330c56/volumes" Mar 14 06:41:44 crc kubenswrapper[4713]: I0314 06:41:44.541483 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:44 crc kubenswrapper[4713]: I0314 06:41:44.608036 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:45 crc kubenswrapper[4713]: I0314 06:41:45.439337 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2nwf"] Mar 14 06:41:46 crc kubenswrapper[4713]: I0314 06:41:46.514479 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-q2nwf" podUID="cc201607-7946-4532-8888-1249fc35e279" containerName="registry-server" containerID="cri-o://e3621e6333e477bc908ff5dc66a3fa78965ec7955486b8b75eb0bdeee1c3a434" gracePeriod=2 Mar 14 06:41:47 crc kubenswrapper[4713]: I0314 06:41:47.531822 4713 generic.go:334] "Generic (PLEG): container finished" podID="cc201607-7946-4532-8888-1249fc35e279" containerID="e3621e6333e477bc908ff5dc66a3fa78965ec7955486b8b75eb0bdeee1c3a434" exitCode=0 Mar 14 06:41:47 crc kubenswrapper[4713]: I0314 06:41:47.531925 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2nwf" event={"ID":"cc201607-7946-4532-8888-1249fc35e279","Type":"ContainerDied","Data":"e3621e6333e477bc908ff5dc66a3fa78965ec7955486b8b75eb0bdeee1c3a434"} Mar 14 06:41:47 crc kubenswrapper[4713]: I0314 06:41:47.859819 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.036804 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-utilities\") pod \"cc201607-7946-4532-8888-1249fc35e279\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.036941 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72ld7\" (UniqueName: \"kubernetes.io/projected/cc201607-7946-4532-8888-1249fc35e279-kube-api-access-72ld7\") pod \"cc201607-7946-4532-8888-1249fc35e279\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.037025 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-catalog-content\") pod 
\"cc201607-7946-4532-8888-1249fc35e279\" (UID: \"cc201607-7946-4532-8888-1249fc35e279\") " Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.040747 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-utilities" (OuterVolumeSpecName: "utilities") pod "cc201607-7946-4532-8888-1249fc35e279" (UID: "cc201607-7946-4532-8888-1249fc35e279"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.048164 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc201607-7946-4532-8888-1249fc35e279-kube-api-access-72ld7" (OuterVolumeSpecName: "kube-api-access-72ld7") pod "cc201607-7946-4532-8888-1249fc35e279" (UID: "cc201607-7946-4532-8888-1249fc35e279"). InnerVolumeSpecName "kube-api-access-72ld7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.112444 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc201607-7946-4532-8888-1249fc35e279" (UID: "cc201607-7946-4532-8888-1249fc35e279"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.139686 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.139722 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72ld7\" (UniqueName: \"kubernetes.io/projected/cc201607-7946-4532-8888-1249fc35e279-kube-api-access-72ld7\") on node \"crc\" DevicePath \"\"" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.139732 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc201607-7946-4532-8888-1249fc35e279-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.546899 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2nwf" event={"ID":"cc201607-7946-4532-8888-1249fc35e279","Type":"ContainerDied","Data":"893cdec3718c8d1c820c8bc65a8add20a5a44f8df9c133295170af90df5a9216"} Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.546976 4713 scope.go:117] "RemoveContainer" containerID="e3621e6333e477bc908ff5dc66a3fa78965ec7955486b8b75eb0bdeee1c3a434" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.547076 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q2nwf" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.584177 4713 scope.go:117] "RemoveContainer" containerID="55c5e573695238c02c61fc1b8a5023f2138fe04d3f65885e7c0e89a4b40888e8" Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.587182 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2nwf"] Mar 14 06:41:48 crc kubenswrapper[4713]: I0314 06:41:48.598674 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q2nwf"] Mar 14 06:41:49 crc kubenswrapper[4713]: I0314 06:41:49.276741 4713 scope.go:117] "RemoveContainer" containerID="eed48af7a194eb4bddc616de73d0bb6f57352872c8489e5a60fe9f4409ec0c5a" Mar 14 06:41:49 crc kubenswrapper[4713]: I0314 06:41:49.582293 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc201607-7946-4532-8888-1249fc35e279" path="/var/lib/kubelet/pods/cc201607-7946-4532-8888-1249fc35e279/volumes" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.164190 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557842-5n4fj"] Mar 14 06:42:00 crc kubenswrapper[4713]: E0314 06:42:00.166353 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc201607-7946-4532-8888-1249fc35e279" containerName="extract-utilities" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.166385 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc201607-7946-4532-8888-1249fc35e279" containerName="extract-utilities" Mar 14 06:42:00 crc kubenswrapper[4713]: E0314 06:42:00.166414 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f649dda-e14a-4096-9b44-64ed89330c56" containerName="extract-utilities" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.166426 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f649dda-e14a-4096-9b44-64ed89330c56" 
containerName="extract-utilities" Mar 14 06:42:00 crc kubenswrapper[4713]: E0314 06:42:00.166446 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc201607-7946-4532-8888-1249fc35e279" containerName="extract-content" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.166457 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc201607-7946-4532-8888-1249fc35e279" containerName="extract-content" Mar 14 06:42:00 crc kubenswrapper[4713]: E0314 06:42:00.166494 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f649dda-e14a-4096-9b44-64ed89330c56" containerName="registry-server" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.166509 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f649dda-e14a-4096-9b44-64ed89330c56" containerName="registry-server" Mar 14 06:42:00 crc kubenswrapper[4713]: E0314 06:42:00.166542 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f649dda-e14a-4096-9b44-64ed89330c56" containerName="extract-content" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.166553 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f649dda-e14a-4096-9b44-64ed89330c56" containerName="extract-content" Mar 14 06:42:00 crc kubenswrapper[4713]: E0314 06:42:00.166578 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc201607-7946-4532-8888-1249fc35e279" containerName="registry-server" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.166591 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc201607-7946-4532-8888-1249fc35e279" containerName="registry-server" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.167097 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f649dda-e14a-4096-9b44-64ed89330c56" containerName="registry-server" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.167144 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc201607-7946-4532-8888-1249fc35e279" 
containerName="registry-server" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.168955 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557842-5n4fj" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.171444 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.171465 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.171947 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.190170 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557842-5n4fj"] Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.268712 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8c5d\" (UniqueName: \"kubernetes.io/projected/dfe7c302-ae6e-4f31-8db8-65e062d4d9eb-kube-api-access-h8c5d\") pod \"auto-csr-approver-29557842-5n4fj\" (UID: \"dfe7c302-ae6e-4f31-8db8-65e062d4d9eb\") " pod="openshift-infra/auto-csr-approver-29557842-5n4fj" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.371706 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8c5d\" (UniqueName: \"kubernetes.io/projected/dfe7c302-ae6e-4f31-8db8-65e062d4d9eb-kube-api-access-h8c5d\") pod \"auto-csr-approver-29557842-5n4fj\" (UID: \"dfe7c302-ae6e-4f31-8db8-65e062d4d9eb\") " pod="openshift-infra/auto-csr-approver-29557842-5n4fj" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.398457 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8c5d\" (UniqueName: 
\"kubernetes.io/projected/dfe7c302-ae6e-4f31-8db8-65e062d4d9eb-kube-api-access-h8c5d\") pod \"auto-csr-approver-29557842-5n4fj\" (UID: \"dfe7c302-ae6e-4f31-8db8-65e062d4d9eb\") " pod="openshift-infra/auto-csr-approver-29557842-5n4fj" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.498845 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557842-5n4fj" Mar 14 06:42:00 crc kubenswrapper[4713]: I0314 06:42:00.985838 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557842-5n4fj"] Mar 14 06:42:01 crc kubenswrapper[4713]: I0314 06:42:01.734131 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557842-5n4fj" event={"ID":"dfe7c302-ae6e-4f31-8db8-65e062d4d9eb","Type":"ContainerStarted","Data":"d0fc64109cdf9c9818f37a1ca70a937f7695e044768f6c2dca0f06f173ef703a"} Mar 14 06:42:02 crc kubenswrapper[4713]: I0314 06:42:02.745815 4713 generic.go:334] "Generic (PLEG): container finished" podID="dfe7c302-ae6e-4f31-8db8-65e062d4d9eb" containerID="f752b2648efd79bf5a8e96ba56bb6dd8596adfd12674ced1a6008c6a21d0c1f6" exitCode=0 Mar 14 06:42:02 crc kubenswrapper[4713]: I0314 06:42:02.745912 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557842-5n4fj" event={"ID":"dfe7c302-ae6e-4f31-8db8-65e062d4d9eb","Type":"ContainerDied","Data":"f752b2648efd79bf5a8e96ba56bb6dd8596adfd12674ced1a6008c6a21d0c1f6"} Mar 14 06:42:04 crc kubenswrapper[4713]: I0314 06:42:04.135434 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557842-5n4fj" Mar 14 06:42:04 crc kubenswrapper[4713]: I0314 06:42:04.275950 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8c5d\" (UniqueName: \"kubernetes.io/projected/dfe7c302-ae6e-4f31-8db8-65e062d4d9eb-kube-api-access-h8c5d\") pod \"dfe7c302-ae6e-4f31-8db8-65e062d4d9eb\" (UID: \"dfe7c302-ae6e-4f31-8db8-65e062d4d9eb\") " Mar 14 06:42:04 crc kubenswrapper[4713]: I0314 06:42:04.283525 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe7c302-ae6e-4f31-8db8-65e062d4d9eb-kube-api-access-h8c5d" (OuterVolumeSpecName: "kube-api-access-h8c5d") pod "dfe7c302-ae6e-4f31-8db8-65e062d4d9eb" (UID: "dfe7c302-ae6e-4f31-8db8-65e062d4d9eb"). InnerVolumeSpecName "kube-api-access-h8c5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:42:04 crc kubenswrapper[4713]: I0314 06:42:04.379750 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8c5d\" (UniqueName: \"kubernetes.io/projected/dfe7c302-ae6e-4f31-8db8-65e062d4d9eb-kube-api-access-h8c5d\") on node \"crc\" DevicePath \"\"" Mar 14 06:42:04 crc kubenswrapper[4713]: I0314 06:42:04.768511 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557842-5n4fj" event={"ID":"dfe7c302-ae6e-4f31-8db8-65e062d4d9eb","Type":"ContainerDied","Data":"d0fc64109cdf9c9818f37a1ca70a937f7695e044768f6c2dca0f06f173ef703a"} Mar 14 06:42:04 crc kubenswrapper[4713]: I0314 06:42:04.768569 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0fc64109cdf9c9818f37a1ca70a937f7695e044768f6c2dca0f06f173ef703a" Mar 14 06:42:04 crc kubenswrapper[4713]: I0314 06:42:04.768682 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557842-5n4fj" Mar 14 06:42:05 crc kubenswrapper[4713]: I0314 06:42:05.206170 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557836-zvct8"] Mar 14 06:42:05 crc kubenswrapper[4713]: I0314 06:42:05.219711 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557836-zvct8"] Mar 14 06:42:05 crc kubenswrapper[4713]: I0314 06:42:05.578571 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df5874d-d429-4ddd-9172-96742cd80f47" path="/var/lib/kubelet/pods/8df5874d-d429-4ddd-9172-96742cd80f47/volumes" Mar 14 06:42:40 crc kubenswrapper[4713]: I0314 06:42:40.731552 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:42:40 crc kubenswrapper[4713]: I0314 06:42:40.732117 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:42:54 crc kubenswrapper[4713]: I0314 06:42:54.390046 4713 scope.go:117] "RemoveContainer" containerID="421fad5f36a0cd00923afd9947150100726bc2fdf85dee8dc7c5e7fdb792b293" Mar 14 06:43:10 crc kubenswrapper[4713]: I0314 06:43:10.731159 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:43:10 crc kubenswrapper[4713]: 
I0314 06:43:10.731791 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.624774 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d259v"] Mar 14 06:43:31 crc kubenswrapper[4713]: E0314 06:43:31.625964 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe7c302-ae6e-4f31-8db8-65e062d4d9eb" containerName="oc" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.625977 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe7c302-ae6e-4f31-8db8-65e062d4d9eb" containerName="oc" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.626194 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe7c302-ae6e-4f31-8db8-65e062d4d9eb" containerName="oc" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.628010 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.669630 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d259v"] Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.780692 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-catalog-content\") pod \"redhat-marketplace-d259v\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.780759 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bs5b\" (UniqueName: \"kubernetes.io/projected/8292815a-1929-4536-b840-a7613285d5c4-kube-api-access-6bs5b\") pod \"redhat-marketplace-d259v\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.780835 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-utilities\") pod \"redhat-marketplace-d259v\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.884417 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-catalog-content\") pod \"redhat-marketplace-d259v\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.884482 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6bs5b\" (UniqueName: \"kubernetes.io/projected/8292815a-1929-4536-b840-a7613285d5c4-kube-api-access-6bs5b\") pod \"redhat-marketplace-d259v\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.884560 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-utilities\") pod \"redhat-marketplace-d259v\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.885169 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-catalog-content\") pod \"redhat-marketplace-d259v\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.885302 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-utilities\") pod \"redhat-marketplace-d259v\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.916415 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bs5b\" (UniqueName: \"kubernetes.io/projected/8292815a-1929-4536-b840-a7613285d5c4-kube-api-access-6bs5b\") pod \"redhat-marketplace-d259v\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:31 crc kubenswrapper[4713]: I0314 06:43:31.957158 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:32 crc kubenswrapper[4713]: I0314 06:43:32.463728 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d259v"] Mar 14 06:43:32 crc kubenswrapper[4713]: I0314 06:43:32.717956 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d259v" event={"ID":"8292815a-1929-4536-b840-a7613285d5c4","Type":"ContainerStarted","Data":"a51667a076e8d8cc5a5e6bdb9aad6cb338ce4fc529d9ec5b6bcc910002032778"} Mar 14 06:43:33 crc kubenswrapper[4713]: I0314 06:43:33.730188 4713 generic.go:334] "Generic (PLEG): container finished" podID="8292815a-1929-4536-b840-a7613285d5c4" containerID="24843ca49d99913b86402fbaf11e6c646164d1fe1d262a92a626f411bde9effa" exitCode=0 Mar 14 06:43:33 crc kubenswrapper[4713]: I0314 06:43:33.730298 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d259v" event={"ID":"8292815a-1929-4536-b840-a7613285d5c4","Type":"ContainerDied","Data":"24843ca49d99913b86402fbaf11e6c646164d1fe1d262a92a626f411bde9effa"} Mar 14 06:43:34 crc kubenswrapper[4713]: I0314 06:43:34.747999 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d259v" event={"ID":"8292815a-1929-4536-b840-a7613285d5c4","Type":"ContainerStarted","Data":"d93879c8c4bcc0cb8c382f41eaf00a2e573ed207c8002afeaea1259f65d42f0b"} Mar 14 06:43:35 crc kubenswrapper[4713]: I0314 06:43:35.759477 4713 generic.go:334] "Generic (PLEG): container finished" podID="8292815a-1929-4536-b840-a7613285d5c4" containerID="d93879c8c4bcc0cb8c382f41eaf00a2e573ed207c8002afeaea1259f65d42f0b" exitCode=0 Mar 14 06:43:35 crc kubenswrapper[4713]: I0314 06:43:35.759568 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d259v" 
event={"ID":"8292815a-1929-4536-b840-a7613285d5c4","Type":"ContainerDied","Data":"d93879c8c4bcc0cb8c382f41eaf00a2e573ed207c8002afeaea1259f65d42f0b"} Mar 14 06:43:36 crc kubenswrapper[4713]: I0314 06:43:36.773576 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d259v" event={"ID":"8292815a-1929-4536-b840-a7613285d5c4","Type":"ContainerStarted","Data":"088dc37edd22c006c9f5e352e4d57bdb596a38c1ccdd50304c18bb0a28a136dc"} Mar 14 06:43:36 crc kubenswrapper[4713]: I0314 06:43:36.796397 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d259v" podStartSLOduration=3.157156121 podStartE2EDuration="5.796376771s" podCreationTimestamp="2026-03-14 06:43:31 +0000 UTC" firstStartedPulling="2026-03-14 06:43:33.73275059 +0000 UTC m=+4596.820659890" lastFinishedPulling="2026-03-14 06:43:36.37197124 +0000 UTC m=+4599.459880540" observedRunningTime="2026-03-14 06:43:36.788609047 +0000 UTC m=+4599.876518347" watchObservedRunningTime="2026-03-14 06:43:36.796376771 +0000 UTC m=+4599.884286071" Mar 14 06:43:40 crc kubenswrapper[4713]: I0314 06:43:40.731446 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:43:40 crc kubenswrapper[4713]: I0314 06:43:40.731925 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:43:40 crc kubenswrapper[4713]: I0314 06:43:40.732033 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 06:43:40 crc kubenswrapper[4713]: I0314 06:43:40.732747 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 06:43:40 crc kubenswrapper[4713]: I0314 06:43:40.732791 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" gracePeriod=600 Mar 14 06:43:40 crc kubenswrapper[4713]: E0314 06:43:40.855671 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:43:41 crc kubenswrapper[4713]: I0314 06:43:41.825714 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" exitCode=0 Mar 14 06:43:41 crc kubenswrapper[4713]: I0314 06:43:41.825908 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"} Mar 14 06:43:41 crc 
kubenswrapper[4713]: I0314 06:43:41.831164 4713 scope.go:117] "RemoveContainer" containerID="290e716be34b1e88381d9fb3675a5b2ee53c4b4de14c38b4b0ed9d78c0ef5154" Mar 14 06:43:41 crc kubenswrapper[4713]: I0314 06:43:41.833643 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:43:41 crc kubenswrapper[4713]: E0314 06:43:41.834060 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:43:41 crc kubenswrapper[4713]: I0314 06:43:41.962419 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:41 crc kubenswrapper[4713]: I0314 06:43:41.962492 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:42 crc kubenswrapper[4713]: I0314 06:43:42.019853 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:42 crc kubenswrapper[4713]: I0314 06:43:42.898830 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:42 crc kubenswrapper[4713]: I0314 06:43:42.948927 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d259v"] Mar 14 06:43:44 crc kubenswrapper[4713]: I0314 06:43:44.862080 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d259v" podUID="8292815a-1929-4536-b840-a7613285d5c4" 
containerName="registry-server" containerID="cri-o://088dc37edd22c006c9f5e352e4d57bdb596a38c1ccdd50304c18bb0a28a136dc" gracePeriod=2 Mar 14 06:43:45 crc kubenswrapper[4713]: I0314 06:43:45.874337 4713 generic.go:334] "Generic (PLEG): container finished" podID="8292815a-1929-4536-b840-a7613285d5c4" containerID="088dc37edd22c006c9f5e352e4d57bdb596a38c1ccdd50304c18bb0a28a136dc" exitCode=0 Mar 14 06:43:45 crc kubenswrapper[4713]: I0314 06:43:45.874465 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d259v" event={"ID":"8292815a-1929-4536-b840-a7613285d5c4","Type":"ContainerDied","Data":"088dc37edd22c006c9f5e352e4d57bdb596a38c1ccdd50304c18bb0a28a136dc"} Mar 14 06:43:45 crc kubenswrapper[4713]: I0314 06:43:45.874674 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d259v" event={"ID":"8292815a-1929-4536-b840-a7613285d5c4","Type":"ContainerDied","Data":"a51667a076e8d8cc5a5e6bdb9aad6cb338ce4fc529d9ec5b6bcc910002032778"} Mar 14 06:43:45 crc kubenswrapper[4713]: I0314 06:43:45.874690 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a51667a076e8d8cc5a5e6bdb9aad6cb338ce4fc529d9ec5b6bcc910002032778" Mar 14 06:43:45 crc kubenswrapper[4713]: I0314 06:43:45.902507 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.049416 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-catalog-content\") pod \"8292815a-1929-4536-b840-a7613285d5c4\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.050249 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-utilities\") pod \"8292815a-1929-4536-b840-a7613285d5c4\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.050450 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bs5b\" (UniqueName: \"kubernetes.io/projected/8292815a-1929-4536-b840-a7613285d5c4-kube-api-access-6bs5b\") pod \"8292815a-1929-4536-b840-a7613285d5c4\" (UID: \"8292815a-1929-4536-b840-a7613285d5c4\") " Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.050917 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-utilities" (OuterVolumeSpecName: "utilities") pod "8292815a-1929-4536-b840-a7613285d5c4" (UID: "8292815a-1929-4536-b840-a7613285d5c4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.051646 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.061344 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8292815a-1929-4536-b840-a7613285d5c4-kube-api-access-6bs5b" (OuterVolumeSpecName: "kube-api-access-6bs5b") pod "8292815a-1929-4536-b840-a7613285d5c4" (UID: "8292815a-1929-4536-b840-a7613285d5c4"). InnerVolumeSpecName "kube-api-access-6bs5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.080333 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8292815a-1929-4536-b840-a7613285d5c4" (UID: "8292815a-1929-4536-b840-a7613285d5c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.153508 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8292815a-1929-4536-b840-a7613285d5c4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.153548 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bs5b\" (UniqueName: \"kubernetes.io/projected/8292815a-1929-4536-b840-a7613285d5c4-kube-api-access-6bs5b\") on node \"crc\" DevicePath \"\"" Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.883745 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d259v" Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.920753 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d259v"] Mar 14 06:43:46 crc kubenswrapper[4713]: I0314 06:43:46.933611 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d259v"] Mar 14 06:43:47 crc kubenswrapper[4713]: I0314 06:43:47.605185 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8292815a-1929-4536-b840-a7613285d5c4" path="/var/lib/kubelet/pods/8292815a-1929-4536-b840-a7613285d5c4/volumes" Mar 14 06:43:52 crc kubenswrapper[4713]: I0314 06:43:52.566078 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:43:52 crc kubenswrapper[4713]: E0314 06:43:52.566849 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.141428 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557844-mgbbv"] Mar 14 06:44:00 crc kubenswrapper[4713]: E0314 06:44:00.142558 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8292815a-1929-4536-b840-a7613285d5c4" containerName="extract-content" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.142576 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8292815a-1929-4536-b840-a7613285d5c4" containerName="extract-content" Mar 14 06:44:00 crc kubenswrapper[4713]: E0314 06:44:00.142595 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8292815a-1929-4536-b840-a7613285d5c4" containerName="registry-server" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.142601 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8292815a-1929-4536-b840-a7613285d5c4" containerName="registry-server" Mar 14 06:44:00 crc kubenswrapper[4713]: E0314 06:44:00.142615 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8292815a-1929-4536-b840-a7613285d5c4" containerName="extract-utilities" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.142623 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8292815a-1929-4536-b840-a7613285d5c4" containerName="extract-utilities" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.142851 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8292815a-1929-4536-b840-a7613285d5c4" containerName="registry-server" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.144456 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557844-mgbbv" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.148284 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.148505 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.149000 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.164340 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557844-mgbbv"] Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.249111 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg6tm\" (UniqueName: \"kubernetes.io/projected/78d845ea-57e1-4693-a9c7-e202ce5ef771-kube-api-access-pg6tm\") pod \"auto-csr-approver-29557844-mgbbv\" (UID: \"78d845ea-57e1-4693-a9c7-e202ce5ef771\") " pod="openshift-infra/auto-csr-approver-29557844-mgbbv" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.351387 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg6tm\" (UniqueName: \"kubernetes.io/projected/78d845ea-57e1-4693-a9c7-e202ce5ef771-kube-api-access-pg6tm\") pod \"auto-csr-approver-29557844-mgbbv\" (UID: \"78d845ea-57e1-4693-a9c7-e202ce5ef771\") " pod="openshift-infra/auto-csr-approver-29557844-mgbbv" Mar 14 06:44:00 crc kubenswrapper[4713]: I0314 06:44:00.910630 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg6tm\" (UniqueName: \"kubernetes.io/projected/78d845ea-57e1-4693-a9c7-e202ce5ef771-kube-api-access-pg6tm\") pod \"auto-csr-approver-29557844-mgbbv\" (UID: \"78d845ea-57e1-4693-a9c7-e202ce5ef771\") " 
pod="openshift-infra/auto-csr-approver-29557844-mgbbv" Mar 14 06:44:01 crc kubenswrapper[4713]: I0314 06:44:01.102834 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557844-mgbbv" Mar 14 06:44:01 crc kubenswrapper[4713]: I0314 06:44:01.671844 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557844-mgbbv"] Mar 14 06:44:01 crc kubenswrapper[4713]: W0314 06:44:01.673996 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78d845ea_57e1_4693_a9c7_e202ce5ef771.slice/crio-c3fe0be10478cb13abdc7cb8e0b657996df3651e00c45532b78142f263042d9c WatchSource:0}: Error finding container c3fe0be10478cb13abdc7cb8e0b657996df3651e00c45532b78142f263042d9c: Status 404 returned error can't find the container with id c3fe0be10478cb13abdc7cb8e0b657996df3651e00c45532b78142f263042d9c Mar 14 06:44:02 crc kubenswrapper[4713]: I0314 06:44:02.086184 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557844-mgbbv" event={"ID":"78d845ea-57e1-4693-a9c7-e202ce5ef771","Type":"ContainerStarted","Data":"c3fe0be10478cb13abdc7cb8e0b657996df3651e00c45532b78142f263042d9c"} Mar 14 06:44:03 crc kubenswrapper[4713]: I0314 06:44:03.109741 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557844-mgbbv" event={"ID":"78d845ea-57e1-4693-a9c7-e202ce5ef771","Type":"ContainerStarted","Data":"d008982ab67c6683ce14724277ac53d61d77e65e74d4eafd020c7f1995222ed4"} Mar 14 06:44:03 crc kubenswrapper[4713]: I0314 06:44:03.136031 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557844-mgbbv" podStartSLOduration=2.171929484 podStartE2EDuration="3.135987453s" podCreationTimestamp="2026-03-14 06:44:00 +0000 UTC" firstStartedPulling="2026-03-14 06:44:01.680896086 +0000 UTC 
m=+4624.768805386" lastFinishedPulling="2026-03-14 06:44:02.644954045 +0000 UTC m=+4625.732863355" observedRunningTime="2026-03-14 06:44:03.128354122 +0000 UTC m=+4626.216263422" watchObservedRunningTime="2026-03-14 06:44:03.135987453 +0000 UTC m=+4626.223896763" Mar 14 06:44:04 crc kubenswrapper[4713]: I0314 06:44:04.130318 4713 generic.go:334] "Generic (PLEG): container finished" podID="78d845ea-57e1-4693-a9c7-e202ce5ef771" containerID="d008982ab67c6683ce14724277ac53d61d77e65e74d4eafd020c7f1995222ed4" exitCode=0 Mar 14 06:44:04 crc kubenswrapper[4713]: I0314 06:44:04.130361 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557844-mgbbv" event={"ID":"78d845ea-57e1-4693-a9c7-e202ce5ef771","Type":"ContainerDied","Data":"d008982ab67c6683ce14724277ac53d61d77e65e74d4eafd020c7f1995222ed4"} Mar 14 06:44:04 crc kubenswrapper[4713]: I0314 06:44:04.563895 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:44:04 crc kubenswrapper[4713]: E0314 06:44:04.564260 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:44:05 crc kubenswrapper[4713]: I0314 06:44:05.557291 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557844-mgbbv" Mar 14 06:44:05 crc kubenswrapper[4713]: I0314 06:44:05.705749 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg6tm\" (UniqueName: \"kubernetes.io/projected/78d845ea-57e1-4693-a9c7-e202ce5ef771-kube-api-access-pg6tm\") pod \"78d845ea-57e1-4693-a9c7-e202ce5ef771\" (UID: \"78d845ea-57e1-4693-a9c7-e202ce5ef771\") " Mar 14 06:44:05 crc kubenswrapper[4713]: I0314 06:44:05.712036 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d845ea-57e1-4693-a9c7-e202ce5ef771-kube-api-access-pg6tm" (OuterVolumeSpecName: "kube-api-access-pg6tm") pod "78d845ea-57e1-4693-a9c7-e202ce5ef771" (UID: "78d845ea-57e1-4693-a9c7-e202ce5ef771"). InnerVolumeSpecName "kube-api-access-pg6tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:44:05 crc kubenswrapper[4713]: I0314 06:44:05.809483 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg6tm\" (UniqueName: \"kubernetes.io/projected/78d845ea-57e1-4693-a9c7-e202ce5ef771-kube-api-access-pg6tm\") on node \"crc\" DevicePath \"\"" Mar 14 06:44:06 crc kubenswrapper[4713]: I0314 06:44:06.151328 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557844-mgbbv" event={"ID":"78d845ea-57e1-4693-a9c7-e202ce5ef771","Type":"ContainerDied","Data":"c3fe0be10478cb13abdc7cb8e0b657996df3651e00c45532b78142f263042d9c"} Mar 14 06:44:06 crc kubenswrapper[4713]: I0314 06:44:06.151375 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3fe0be10478cb13abdc7cb8e0b657996df3651e00c45532b78142f263042d9c" Mar 14 06:44:06 crc kubenswrapper[4713]: I0314 06:44:06.151433 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557844-mgbbv"
Mar 14 06:44:06 crc kubenswrapper[4713]: I0314 06:44:06.240675 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557838-n68cv"]
Mar 14 06:44:06 crc kubenswrapper[4713]: I0314 06:44:06.255311 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557838-n68cv"]
Mar 14 06:44:07 crc kubenswrapper[4713]: I0314 06:44:07.579872 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43491cff-9721-4697-94ec-135986e04b5d" path="/var/lib/kubelet/pods/43491cff-9721-4697-94ec-135986e04b5d/volumes"
Mar 14 06:44:17 crc kubenswrapper[4713]: I0314 06:44:17.574496 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:44:17 crc kubenswrapper[4713]: E0314 06:44:17.575450 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:44:31 crc kubenswrapper[4713]: I0314 06:44:31.564256 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:44:31 crc kubenswrapper[4713]: E0314 06:44:31.565070 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:44:46 crc kubenswrapper[4713]: I0314 06:44:46.563583 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:44:46 crc kubenswrapper[4713]: E0314 06:44:46.564397 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:44:54 crc kubenswrapper[4713]: I0314 06:44:54.510922 4713 scope.go:117] "RemoveContainer" containerID="13f3fa4d309f4e7dd8c9013cb5a00ce18c9e9bc24916da2d8d9013b50d1a7940"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.146658 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"]
Mar 14 06:45:00 crc kubenswrapper[4713]: E0314 06:45:00.147637 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d845ea-57e1-4693-a9c7-e202ce5ef771" containerName="oc"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.147650 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d845ea-57e1-4693-a9c7-e202ce5ef771" containerName="oc"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.147900 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d845ea-57e1-4693-a9c7-e202ce5ef771" containerName="oc"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.148640 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.152434 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.152543 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.165037 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"]
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.193719 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d02daeb-c1bd-498f-addf-e053241eb38b-secret-volume\") pod \"collect-profiles-29557845-tj8pd\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.193964 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qftx\" (UniqueName: \"kubernetes.io/projected/9d02daeb-c1bd-498f-addf-e053241eb38b-kube-api-access-6qftx\") pod \"collect-profiles-29557845-tj8pd\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.194086 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d02daeb-c1bd-498f-addf-e053241eb38b-config-volume\") pod \"collect-profiles-29557845-tj8pd\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.296256 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qftx\" (UniqueName: \"kubernetes.io/projected/9d02daeb-c1bd-498f-addf-e053241eb38b-kube-api-access-6qftx\") pod \"collect-profiles-29557845-tj8pd\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.296367 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d02daeb-c1bd-498f-addf-e053241eb38b-config-volume\") pod \"collect-profiles-29557845-tj8pd\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.296540 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d02daeb-c1bd-498f-addf-e053241eb38b-secret-volume\") pod \"collect-profiles-29557845-tj8pd\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.297328 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d02daeb-c1bd-498f-addf-e053241eb38b-config-volume\") pod \"collect-profiles-29557845-tj8pd\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.312861 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d02daeb-c1bd-498f-addf-e053241eb38b-secret-volume\") pod \"collect-profiles-29557845-tj8pd\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.315849 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qftx\" (UniqueName: \"kubernetes.io/projected/9d02daeb-c1bd-498f-addf-e053241eb38b-kube-api-access-6qftx\") pod \"collect-profiles-29557845-tj8pd\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.467312 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:00 crc kubenswrapper[4713]: I0314 06:45:00.987527 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"]
Mar 14 06:45:01 crc kubenswrapper[4713]: I0314 06:45:01.113538 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd" event={"ID":"9d02daeb-c1bd-498f-addf-e053241eb38b","Type":"ContainerStarted","Data":"f06d947d48fdb87cd43a48d63879d3bc9d4a3b5ee137799735cc7de1410b8fd2"}
Mar 14 06:45:01 crc kubenswrapper[4713]: I0314 06:45:01.563757 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:45:01 crc kubenswrapper[4713]: E0314 06:45:01.564090 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:45:02 crc kubenswrapper[4713]: I0314 06:45:02.125220 4713 generic.go:334] "Generic (PLEG): container finished" podID="9d02daeb-c1bd-498f-addf-e053241eb38b" containerID="53f44ae230115728981e08332e4c73b6ac503c0e0e5662e2d82998d56b614265" exitCode=0
Mar 14 06:45:02 crc kubenswrapper[4713]: I0314 06:45:02.125328 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd" event={"ID":"9d02daeb-c1bd-498f-addf-e053241eb38b","Type":"ContainerDied","Data":"53f44ae230115728981e08332e4c73b6ac503c0e0e5662e2d82998d56b614265"}
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.589487 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.728751 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d02daeb-c1bd-498f-addf-e053241eb38b-config-volume\") pod \"9d02daeb-c1bd-498f-addf-e053241eb38b\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") "
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.729099 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qftx\" (UniqueName: \"kubernetes.io/projected/9d02daeb-c1bd-498f-addf-e053241eb38b-kube-api-access-6qftx\") pod \"9d02daeb-c1bd-498f-addf-e053241eb38b\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") "
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.729348 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d02daeb-c1bd-498f-addf-e053241eb38b-secret-volume\") pod \"9d02daeb-c1bd-498f-addf-e053241eb38b\" (UID: \"9d02daeb-c1bd-498f-addf-e053241eb38b\") "
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.729579 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d02daeb-c1bd-498f-addf-e053241eb38b-config-volume" (OuterVolumeSpecName: "config-volume") pod "9d02daeb-c1bd-498f-addf-e053241eb38b" (UID: "9d02daeb-c1bd-498f-addf-e053241eb38b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.730391 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d02daeb-c1bd-498f-addf-e053241eb38b-config-volume\") on node \"crc\" DevicePath \"\""
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.735841 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d02daeb-c1bd-498f-addf-e053241eb38b-kube-api-access-6qftx" (OuterVolumeSpecName: "kube-api-access-6qftx") pod "9d02daeb-c1bd-498f-addf-e053241eb38b" (UID: "9d02daeb-c1bd-498f-addf-e053241eb38b"). InnerVolumeSpecName "kube-api-access-6qftx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.736669 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d02daeb-c1bd-498f-addf-e053241eb38b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9d02daeb-c1bd-498f-addf-e053241eb38b" (UID: "9d02daeb-c1bd-498f-addf-e053241eb38b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.834103 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qftx\" (UniqueName: \"kubernetes.io/projected/9d02daeb-c1bd-498f-addf-e053241eb38b-kube-api-access-6qftx\") on node \"crc\" DevicePath \"\""
Mar 14 06:45:03 crc kubenswrapper[4713]: I0314 06:45:03.834147 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d02daeb-c1bd-498f-addf-e053241eb38b-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 14 06:45:04 crc kubenswrapper[4713]: I0314 06:45:04.154391 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd" event={"ID":"9d02daeb-c1bd-498f-addf-e053241eb38b","Type":"ContainerDied","Data":"f06d947d48fdb87cd43a48d63879d3bc9d4a3b5ee137799735cc7de1410b8fd2"}
Mar 14 06:45:04 crc kubenswrapper[4713]: I0314 06:45:04.154441 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f06d947d48fdb87cd43a48d63879d3bc9d4a3b5ee137799735cc7de1410b8fd2"
Mar 14 06:45:04 crc kubenswrapper[4713]: I0314 06:45:04.154448 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557845-tj8pd"
Mar 14 06:45:04 crc kubenswrapper[4713]: I0314 06:45:04.678408 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv"]
Mar 14 06:45:04 crc kubenswrapper[4713]: I0314 06:45:04.689317 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557800-gn6hv"]
Mar 14 06:45:05 crc kubenswrapper[4713]: I0314 06:45:05.577213 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75332f9a-1c54-42cf-8030-525ee3cffca1" path="/var/lib/kubelet/pods/75332f9a-1c54-42cf-8030-525ee3cffca1/volumes"
Mar 14 06:45:12 crc kubenswrapper[4713]: I0314 06:45:12.563919 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:45:12 crc kubenswrapper[4713]: E0314 06:45:12.564749 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:45:26 crc kubenswrapper[4713]: I0314 06:45:26.565040 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:45:26 crc kubenswrapper[4713]: E0314 06:45:26.566552 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:45:40 crc kubenswrapper[4713]: I0314 06:45:40.564010 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:45:40 crc kubenswrapper[4713]: E0314 06:45:40.565009 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:45:54 crc kubenswrapper[4713]: I0314 06:45:54.564339 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:45:54 crc kubenswrapper[4713]: E0314 06:45:54.565350 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:45:54 crc kubenswrapper[4713]: I0314 06:45:54.629393 4713 scope.go:117] "RemoveContainer" containerID="9c4f1e8a29a3f0aca04e04e6cb918dcfad20423f2d7a61a9830d5a1ed0b35ad7"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.145555 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557846-jbmqw"]
Mar 14 06:46:00 crc kubenswrapper[4713]: E0314 06:46:00.147421 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d02daeb-c1bd-498f-addf-e053241eb38b" containerName="collect-profiles"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.147502 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d02daeb-c1bd-498f-addf-e053241eb38b" containerName="collect-profiles"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.147887 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d02daeb-c1bd-498f-addf-e053241eb38b" containerName="collect-profiles"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.148813 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557846-jbmqw"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.152521 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.152667 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.152861 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.156615 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557846-jbmqw"]
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.199172 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kklw\" (UniqueName: \"kubernetes.io/projected/6abcaea2-39c2-4f9f-85fe-b51d5e791e17-kube-api-access-6kklw\") pod \"auto-csr-approver-29557846-jbmqw\" (UID: \"6abcaea2-39c2-4f9f-85fe-b51d5e791e17\") " pod="openshift-infra/auto-csr-approver-29557846-jbmqw"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.301912 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kklw\" (UniqueName: \"kubernetes.io/projected/6abcaea2-39c2-4f9f-85fe-b51d5e791e17-kube-api-access-6kklw\") pod \"auto-csr-approver-29557846-jbmqw\" (UID: \"6abcaea2-39c2-4f9f-85fe-b51d5e791e17\") " pod="openshift-infra/auto-csr-approver-29557846-jbmqw"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.616446 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kklw\" (UniqueName: \"kubernetes.io/projected/6abcaea2-39c2-4f9f-85fe-b51d5e791e17-kube-api-access-6kklw\") pod \"auto-csr-approver-29557846-jbmqw\" (UID: \"6abcaea2-39c2-4f9f-85fe-b51d5e791e17\") " pod="openshift-infra/auto-csr-approver-29557846-jbmqw"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.616750 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kklw\" (UniqueName: \"kubernetes.io/projected/6abcaea2-39c2-4f9f-85fe-b51d5e791e17-kube-api-access-6kklw\") pod \"auto-csr-approver-29557846-jbmqw\" (UID: \"6abcaea2-39c2-4f9f-85fe-b51d5e791e17\") " pod="openshift-infra/auto-csr-approver-29557846-jbmqw"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.617122 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kklw\" (UniqueName: \"kubernetes.io/projected/6abcaea2-39c2-4f9f-85fe-b51d5e791e17-kube-api-access-6kklw\") pod \"auto-csr-approver-29557846-jbmqw\" (UID: \"6abcaea2-39c2-4f9f-85fe-b51d5e791e17\") " pod="openshift-infra/auto-csr-approver-29557846-jbmqw"
Mar 14 06:46:00 crc kubenswrapper[4713]: I0314 06:46:00.769623 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557846-jbmqw"
Mar 14 06:46:01 crc kubenswrapper[4713]: I0314 06:46:01.340358 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557846-jbmqw"]
Mar 14 06:46:01 crc kubenswrapper[4713]: I0314 06:46:01.361671 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 06:46:01 crc kubenswrapper[4713]: I0314 06:46:01.809257 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557846-jbmqw" event={"ID":"6abcaea2-39c2-4f9f-85fe-b51d5e791e17","Type":"ContainerStarted","Data":"ec4a74f5ffbc57c0c85f4100cc3afe52dfb8e406bd2df41f1cc42f3a2072e71c"}
Mar 14 06:46:02 crc kubenswrapper[4713]: I0314 06:46:02.821737 4713 generic.go:334] "Generic (PLEG): container finished" podID="6abcaea2-39c2-4f9f-85fe-b51d5e791e17" containerID="e7135d1a6f4bc49cc0e885b5c22866db6a625feb52cf053de5ac05e2c11f025c" exitCode=0
Mar 14 06:46:02 crc kubenswrapper[4713]: I0314 06:46:02.821796 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557846-jbmqw" event={"ID":"6abcaea2-39c2-4f9f-85fe-b51d5e791e17","Type":"ContainerDied","Data":"e7135d1a6f4bc49cc0e885b5c22866db6a625feb52cf053de5ac05e2c11f025c"}
Mar 14 06:46:04 crc kubenswrapper[4713]: I0314 06:46:04.358044 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557846-jbmqw"
Mar 14 06:46:04 crc kubenswrapper[4713]: I0314 06:46:04.465435 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kklw\" (UniqueName: \"kubernetes.io/projected/6abcaea2-39c2-4f9f-85fe-b51d5e791e17-kube-api-access-6kklw\") pod \"6abcaea2-39c2-4f9f-85fe-b51d5e791e17\" (UID: \"6abcaea2-39c2-4f9f-85fe-b51d5e791e17\") "
Mar 14 06:46:04 crc kubenswrapper[4713]: I0314 06:46:04.484299 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6abcaea2-39c2-4f9f-85fe-b51d5e791e17-kube-api-access-6kklw" (OuterVolumeSpecName: "kube-api-access-6kklw") pod "6abcaea2-39c2-4f9f-85fe-b51d5e791e17" (UID: "6abcaea2-39c2-4f9f-85fe-b51d5e791e17"). InnerVolumeSpecName "kube-api-access-6kklw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:46:04 crc kubenswrapper[4713]: I0314 06:46:04.567632 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kklw\" (UniqueName: \"kubernetes.io/projected/6abcaea2-39c2-4f9f-85fe-b51d5e791e17-kube-api-access-6kklw\") on node \"crc\" DevicePath \"\""
Mar 14 06:46:04 crc kubenswrapper[4713]: I0314 06:46:04.844429 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557846-jbmqw" event={"ID":"6abcaea2-39c2-4f9f-85fe-b51d5e791e17","Type":"ContainerDied","Data":"ec4a74f5ffbc57c0c85f4100cc3afe52dfb8e406bd2df41f1cc42f3a2072e71c"}
Mar 14 06:46:04 crc kubenswrapper[4713]: I0314 06:46:04.844775 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec4a74f5ffbc57c0c85f4100cc3afe52dfb8e406bd2df41f1cc42f3a2072e71c"
Mar 14 06:46:04 crc kubenswrapper[4713]: I0314 06:46:04.844483 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557846-jbmqw"
Mar 14 06:46:05 crc kubenswrapper[4713]: I0314 06:46:05.438795 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557840-b7dwd"]
Mar 14 06:46:05 crc kubenswrapper[4713]: I0314 06:46:05.451789 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557840-b7dwd"]
Mar 14 06:46:05 crc kubenswrapper[4713]: I0314 06:46:05.580364 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2868b35f-86fe-4b44-98d7-8d5a429938f3" path="/var/lib/kubelet/pods/2868b35f-86fe-4b44-98d7-8d5a429938f3/volumes"
Mar 14 06:46:06 crc kubenswrapper[4713]: I0314 06:46:06.563705 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:46:06 crc kubenswrapper[4713]: E0314 06:46:06.564387 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:46:19 crc kubenswrapper[4713]: I0314 06:46:19.564745 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:46:19 crc kubenswrapper[4713]: E0314 06:46:19.565730 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:46:30 crc kubenswrapper[4713]: I0314 06:46:30.565390 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:46:30 crc kubenswrapper[4713]: E0314 06:46:30.566729 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.285653 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lb4tq"]
Mar 14 06:46:31 crc kubenswrapper[4713]: E0314 06:46:31.286365 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6abcaea2-39c2-4f9f-85fe-b51d5e791e17" containerName="oc"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.286388 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abcaea2-39c2-4f9f-85fe-b51d5e791e17" containerName="oc"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.286689 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6abcaea2-39c2-4f9f-85fe-b51d5e791e17" containerName="oc"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.289265 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.297608 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lb4tq"]
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.436278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42sg7\" (UniqueName: \"kubernetes.io/projected/fc22fe30-840a-4e2a-b9ef-ac51129783dd-kube-api-access-42sg7\") pod \"redhat-operators-lb4tq\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.436337 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-utilities\") pod \"redhat-operators-lb4tq\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.436907 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-catalog-content\") pod \"redhat-operators-lb4tq\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.539869 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42sg7\" (UniqueName: \"kubernetes.io/projected/fc22fe30-840a-4e2a-b9ef-ac51129783dd-kube-api-access-42sg7\") pod \"redhat-operators-lb4tq\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.539937 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-utilities\") pod \"redhat-operators-lb4tq\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.540051 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-catalog-content\") pod \"redhat-operators-lb4tq\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.540786 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-catalog-content\") pod \"redhat-operators-lb4tq\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.541100 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-utilities\") pod \"redhat-operators-lb4tq\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.572641 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42sg7\" (UniqueName: \"kubernetes.io/projected/fc22fe30-840a-4e2a-b9ef-ac51129783dd-kube-api-access-42sg7\") pod \"redhat-operators-lb4tq\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:31 crc kubenswrapper[4713]: I0314 06:46:31.619527 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:32 crc kubenswrapper[4713]: I0314 06:46:32.244946 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lb4tq"]
Mar 14 06:46:33 crc kubenswrapper[4713]: I0314 06:46:33.173062 4713 generic.go:334] "Generic (PLEG): container finished" podID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerID="933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2" exitCode=0
Mar 14 06:46:33 crc kubenswrapper[4713]: I0314 06:46:33.173168 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb4tq" event={"ID":"fc22fe30-840a-4e2a-b9ef-ac51129783dd","Type":"ContainerDied","Data":"933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2"}
Mar 14 06:46:33 crc kubenswrapper[4713]: I0314 06:46:33.173373 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb4tq" event={"ID":"fc22fe30-840a-4e2a-b9ef-ac51129783dd","Type":"ContainerStarted","Data":"1c67d41342eed07d3c5e3319bc34e1be47cb2cc2348f53cd70bfb68719d298fd"}
Mar 14 06:46:35 crc kubenswrapper[4713]: I0314 06:46:35.204628 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb4tq" event={"ID":"fc22fe30-840a-4e2a-b9ef-ac51129783dd","Type":"ContainerStarted","Data":"5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979"}
Mar 14 06:46:41 crc kubenswrapper[4713]: I0314 06:46:41.278849 4713 generic.go:334] "Generic (PLEG): container finished" podID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerID="5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979" exitCode=0
Mar 14 06:46:41 crc kubenswrapper[4713]: I0314 06:46:41.278932 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb4tq" event={"ID":"fc22fe30-840a-4e2a-b9ef-ac51129783dd","Type":"ContainerDied","Data":"5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979"}
Mar 14 06:46:42 crc kubenswrapper[4713]: I0314 06:46:42.292811 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb4tq" event={"ID":"fc22fe30-840a-4e2a-b9ef-ac51129783dd","Type":"ContainerStarted","Data":"322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749"}
Mar 14 06:46:42 crc kubenswrapper[4713]: I0314 06:46:42.333994 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lb4tq" podStartSLOduration=2.8242080659999997 podStartE2EDuration="11.333965658s" podCreationTimestamp="2026-03-14 06:46:31 +0000 UTC" firstStartedPulling="2026-03-14 06:46:33.175351292 +0000 UTC m=+4776.263260592" lastFinishedPulling="2026-03-14 06:46:41.685108884 +0000 UTC m=+4784.773018184" observedRunningTime="2026-03-14 06:46:42.314448874 +0000 UTC m=+4785.402358184" watchObservedRunningTime="2026-03-14 06:46:42.333965658 +0000 UTC m=+4785.421874988"
Mar 14 06:46:43 crc kubenswrapper[4713]: I0314 06:46:43.564243 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:46:43 crc kubenswrapper[4713]: E0314 06:46:43.565162 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:46:51 crc kubenswrapper[4713]: I0314 06:46:51.620457 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:51 crc kubenswrapper[4713]: I0314 06:46:51.621008 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:46:52 crc kubenswrapper[4713]: I0314 06:46:52.669356 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lb4tq" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:46:52 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:46:52 crc kubenswrapper[4713]: >
Mar 14 06:46:54 crc kubenswrapper[4713]: I0314 06:46:54.706999 4713 scope.go:117] "RemoveContainer" containerID="289dd2588fddaf3371179da8d1bdb7a81c3845268272a86d9abcc9f0cad2fbc9"
Mar 14 06:46:56 crc kubenswrapper[4713]: I0314 06:46:56.564346 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:46:56 crc kubenswrapper[4713]: E0314 06:46:56.564960 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:47:02 crc kubenswrapper[4713]: I0314 06:47:02.749499 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lb4tq" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:47:02 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:47:02 crc kubenswrapper[4713]: >
Mar 14 06:47:11 crc kubenswrapper[4713]: I0314 06:47:11.564430 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60"
Mar 14 06:47:11 crc kubenswrapper[4713]: E0314 06:47:11.565414 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:47:12 crc kubenswrapper[4713]: I0314 06:47:12.671939 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lb4tq" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:47:12 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:47:12 crc kubenswrapper[4713]: >
Mar 14 06:47:21 crc kubenswrapper[4713]: I0314 06:47:21.948124 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:47:22 crc kubenswrapper[4713]: I0314 06:47:22.003296 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lb4tq"
Mar 14 06:47:22 crc kubenswrapper[4713]: I0314 06:47:22.187118 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lb4tq"]
Mar 14 06:47:23 crc kubenswrapper[4713]: I0314 06:47:23.730855 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lb4tq" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="registry-server" containerID="cri-o://322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749" gracePeriod=2
Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.431455 4713 util.go:48] "No ready sandbox
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lb4tq" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.574238 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42sg7\" (UniqueName: \"kubernetes.io/projected/fc22fe30-840a-4e2a-b9ef-ac51129783dd-kube-api-access-42sg7\") pod \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.574455 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-catalog-content\") pod \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.574615 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-utilities\") pod \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\" (UID: \"fc22fe30-840a-4e2a-b9ef-ac51129783dd\") " Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.575075 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-utilities" (OuterVolumeSpecName: "utilities") pod "fc22fe30-840a-4e2a-b9ef-ac51129783dd" (UID: "fc22fe30-840a-4e2a-b9ef-ac51129783dd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.576861 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.581316 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc22fe30-840a-4e2a-b9ef-ac51129783dd-kube-api-access-42sg7" (OuterVolumeSpecName: "kube-api-access-42sg7") pod "fc22fe30-840a-4e2a-b9ef-ac51129783dd" (UID: "fc22fe30-840a-4e2a-b9ef-ac51129783dd"). InnerVolumeSpecName "kube-api-access-42sg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.679444 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42sg7\" (UniqueName: \"kubernetes.io/projected/fc22fe30-840a-4e2a-b9ef-ac51129783dd-kube-api-access-42sg7\") on node \"crc\" DevicePath \"\"" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.701598 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc22fe30-840a-4e2a-b9ef-ac51129783dd" (UID: "fc22fe30-840a-4e2a-b9ef-ac51129783dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.753665 4713 generic.go:334] "Generic (PLEG): container finished" podID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerID="322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749" exitCode=0 Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.753717 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb4tq" event={"ID":"fc22fe30-840a-4e2a-b9ef-ac51129783dd","Type":"ContainerDied","Data":"322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749"} Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.753749 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb4tq" event={"ID":"fc22fe30-840a-4e2a-b9ef-ac51129783dd","Type":"ContainerDied","Data":"1c67d41342eed07d3c5e3319bc34e1be47cb2cc2348f53cd70bfb68719d298fd"} Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.753768 4713 scope.go:117] "RemoveContainer" containerID="322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.753811 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lb4tq" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.790050 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc22fe30-840a-4e2a-b9ef-ac51129783dd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.796667 4713 scope.go:117] "RemoveContainer" containerID="5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.805347 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lb4tq"] Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.816760 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lb4tq"] Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.834690 4713 scope.go:117] "RemoveContainer" containerID="933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.886552 4713 scope.go:117] "RemoveContainer" containerID="322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749" Mar 14 06:47:24 crc kubenswrapper[4713]: E0314 06:47:24.887242 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749\": container with ID starting with 322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749 not found: ID does not exist" containerID="322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.887300 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749"} err="failed to get container status 
\"322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749\": rpc error: code = NotFound desc = could not find container \"322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749\": container with ID starting with 322690915f9b49ad97f554676a98de18d40f4f1c763aaab4db91e86d69569749 not found: ID does not exist" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.887334 4713 scope.go:117] "RemoveContainer" containerID="5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979" Mar 14 06:47:24 crc kubenswrapper[4713]: E0314 06:47:24.887993 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979\": container with ID starting with 5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979 not found: ID does not exist" containerID="5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.888051 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979"} err="failed to get container status \"5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979\": rpc error: code = NotFound desc = could not find container \"5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979\": container with ID starting with 5e3b8c0ea9f859e9877e623bfc6debcd88ea4e355e6eb384711a26736cb65979 not found: ID does not exist" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.888090 4713 scope.go:117] "RemoveContainer" containerID="933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2" Mar 14 06:47:24 crc kubenswrapper[4713]: E0314 06:47:24.888487 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2\": container with ID starting with 933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2 not found: ID does not exist" containerID="933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2" Mar 14 06:47:24 crc kubenswrapper[4713]: I0314 06:47:24.888530 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2"} err="failed to get container status \"933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2\": rpc error: code = NotFound desc = could not find container \"933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2\": container with ID starting with 933cf93514610375fd25d3a68432f73a9ebb7ff5ee7f0f24f3127b66eb5ebbc2 not found: ID does not exist" Mar 14 06:47:25 crc kubenswrapper[4713]: I0314 06:47:25.577942 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" path="/var/lib/kubelet/pods/fc22fe30-840a-4e2a-b9ef-ac51129783dd/volumes" Mar 14 06:47:26 crc kubenswrapper[4713]: I0314 06:47:26.564854 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:47:26 crc kubenswrapper[4713]: E0314 06:47:26.565463 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:47:38 crc kubenswrapper[4713]: I0314 06:47:38.564860 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:47:38 crc 
kubenswrapper[4713]: E0314 06:47:38.566061 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:47:49 crc kubenswrapper[4713]: I0314 06:47:49.563989 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:47:49 crc kubenswrapper[4713]: E0314 06:47:49.564832 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.143912 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557848-qssgx"] Mar 14 06:48:00 crc kubenswrapper[4713]: E0314 06:48:00.145871 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="extract-utilities" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.145962 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="extract-utilities" Mar 14 06:48:00 crc kubenswrapper[4713]: E0314 06:48:00.146045 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="extract-content" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.146107 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="extract-content" Mar 14 06:48:00 crc kubenswrapper[4713]: E0314 06:48:00.146173 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="registry-server" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.146252 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="registry-server" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.146540 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc22fe30-840a-4e2a-b9ef-ac51129783dd" containerName="registry-server" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.147462 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557848-qssgx" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.149694 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.150097 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.150273 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.153703 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557848-qssgx"] Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.161420 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4pks\" (UniqueName: \"kubernetes.io/projected/93d072e9-8b3c-4ffd-925c-d908ada34f3d-kube-api-access-t4pks\") pod \"auto-csr-approver-29557848-qssgx\" (UID: 
\"93d072e9-8b3c-4ffd-925c-d908ada34f3d\") " pod="openshift-infra/auto-csr-approver-29557848-qssgx" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.263359 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4pks\" (UniqueName: \"kubernetes.io/projected/93d072e9-8b3c-4ffd-925c-d908ada34f3d-kube-api-access-t4pks\") pod \"auto-csr-approver-29557848-qssgx\" (UID: \"93d072e9-8b3c-4ffd-925c-d908ada34f3d\") " pod="openshift-infra/auto-csr-approver-29557848-qssgx" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.289649 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4pks\" (UniqueName: \"kubernetes.io/projected/93d072e9-8b3c-4ffd-925c-d908ada34f3d-kube-api-access-t4pks\") pod \"auto-csr-approver-29557848-qssgx\" (UID: \"93d072e9-8b3c-4ffd-925c-d908ada34f3d\") " pod="openshift-infra/auto-csr-approver-29557848-qssgx" Mar 14 06:48:00 crc kubenswrapper[4713]: I0314 06:48:00.477182 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557848-qssgx" Mar 14 06:48:01 crc kubenswrapper[4713]: I0314 06:48:01.014506 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557848-qssgx"] Mar 14 06:48:02 crc kubenswrapper[4713]: I0314 06:48:02.568121 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557848-qssgx" event={"ID":"93d072e9-8b3c-4ffd-925c-d908ada34f3d","Type":"ContainerStarted","Data":"a5753109acd7ee839f00a696538f7c03986656d4c087621a7ab604450e74db38"} Mar 14 06:48:03 crc kubenswrapper[4713]: I0314 06:48:03.565157 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:48:03 crc kubenswrapper[4713]: E0314 06:48:03.566460 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:48:03 crc kubenswrapper[4713]: I0314 06:48:03.581862 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557848-qssgx" event={"ID":"93d072e9-8b3c-4ffd-925c-d908ada34f3d","Type":"ContainerStarted","Data":"6c79b52545c45df99b51f2edb6c404c1996b31d9f681258c10adbf6d6dc53c70"} Mar 14 06:48:03 crc kubenswrapper[4713]: I0314 06:48:03.615546 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557848-qssgx" podStartSLOduration=2.619923546 podStartE2EDuration="3.615515606s" podCreationTimestamp="2026-03-14 06:48:00 +0000 UTC" firstStartedPulling="2026-03-14 06:48:01.714675947 +0000 UTC m=+4864.802585237" lastFinishedPulling="2026-03-14 
06:48:02.710267997 +0000 UTC m=+4865.798177297" observedRunningTime="2026-03-14 06:48:03.600328638 +0000 UTC m=+4866.688237938" watchObservedRunningTime="2026-03-14 06:48:03.615515606 +0000 UTC m=+4866.703424916" Mar 14 06:48:04 crc kubenswrapper[4713]: I0314 06:48:04.592936 4713 generic.go:334] "Generic (PLEG): container finished" podID="93d072e9-8b3c-4ffd-925c-d908ada34f3d" containerID="6c79b52545c45df99b51f2edb6c404c1996b31d9f681258c10adbf6d6dc53c70" exitCode=0 Mar 14 06:48:04 crc kubenswrapper[4713]: I0314 06:48:04.592992 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557848-qssgx" event={"ID":"93d072e9-8b3c-4ffd-925c-d908ada34f3d","Type":"ContainerDied","Data":"6c79b52545c45df99b51f2edb6c404c1996b31d9f681258c10adbf6d6dc53c70"} Mar 14 06:48:06 crc kubenswrapper[4713]: I0314 06:48:06.017813 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557848-qssgx" Mar 14 06:48:06 crc kubenswrapper[4713]: I0314 06:48:06.136540 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4pks\" (UniqueName: \"kubernetes.io/projected/93d072e9-8b3c-4ffd-925c-d908ada34f3d-kube-api-access-t4pks\") pod \"93d072e9-8b3c-4ffd-925c-d908ada34f3d\" (UID: \"93d072e9-8b3c-4ffd-925c-d908ada34f3d\") " Mar 14 06:48:06 crc kubenswrapper[4713]: I0314 06:48:06.143417 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d072e9-8b3c-4ffd-925c-d908ada34f3d-kube-api-access-t4pks" (OuterVolumeSpecName: "kube-api-access-t4pks") pod "93d072e9-8b3c-4ffd-925c-d908ada34f3d" (UID: "93d072e9-8b3c-4ffd-925c-d908ada34f3d"). InnerVolumeSpecName "kube-api-access-t4pks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:48:06 crc kubenswrapper[4713]: I0314 06:48:06.239553 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4pks\" (UniqueName: \"kubernetes.io/projected/93d072e9-8b3c-4ffd-925c-d908ada34f3d-kube-api-access-t4pks\") on node \"crc\" DevicePath \"\"" Mar 14 06:48:06 crc kubenswrapper[4713]: I0314 06:48:06.617296 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557848-qssgx" event={"ID":"93d072e9-8b3c-4ffd-925c-d908ada34f3d","Type":"ContainerDied","Data":"a5753109acd7ee839f00a696538f7c03986656d4c087621a7ab604450e74db38"} Mar 14 06:48:06 crc kubenswrapper[4713]: I0314 06:48:06.617625 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5753109acd7ee839f00a696538f7c03986656d4c087621a7ab604450e74db38" Mar 14 06:48:06 crc kubenswrapper[4713]: I0314 06:48:06.617379 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557848-qssgx" Mar 14 06:48:06 crc kubenswrapper[4713]: I0314 06:48:06.680845 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557842-5n4fj"] Mar 14 06:48:06 crc kubenswrapper[4713]: I0314 06:48:06.696534 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557842-5n4fj"] Mar 14 06:48:07 crc kubenswrapper[4713]: I0314 06:48:07.576959 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe7c302-ae6e-4f31-8db8-65e062d4d9eb" path="/var/lib/kubelet/pods/dfe7c302-ae6e-4f31-8db8-65e062d4d9eb/volumes" Mar 14 06:48:14 crc kubenswrapper[4713]: I0314 06:48:14.563423 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:48:14 crc kubenswrapper[4713]: E0314 06:48:14.564164 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.785813 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 14 06:48:18 crc kubenswrapper[4713]: E0314 06:48:18.787167 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d072e9-8b3c-4ffd-925c-d908ada34f3d" containerName="oc" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.787194 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d072e9-8b3c-4ffd-925c-d908ada34f3d" containerName="oc" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.787710 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d072e9-8b3c-4ffd-925c-d908ada34f3d" containerName="oc" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.789439 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.792309 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.792368 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.792373 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.792450 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qcv76" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.803072 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.865106 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.865167 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.865199 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.865285 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.865321 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.865375 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.865464 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mndjs\" (UniqueName: \"kubernetes.io/projected/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-kube-api-access-mndjs\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.865566 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.865638 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-config-data\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.968477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.968564 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-config-data\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.968708 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.968738 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: 
\"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.968758 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.968782 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.968800 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.968831 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.968872 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mndjs\" (UniqueName: \"kubernetes.io/projected/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-kube-api-access-mndjs\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " 
pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.969338 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.969946 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.970020 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-config-data\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.970227 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.972615 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc 
kubenswrapper[4713]: I0314 06:48:18.976442 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.976659 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.985609 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mndjs\" (UniqueName: \"kubernetes.io/projected/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-kube-api-access-mndjs\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:18 crc kubenswrapper[4713]: I0314 06:48:18.986912 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:19 crc kubenswrapper[4713]: I0314 06:48:19.113837 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " pod="openstack/tempest-tests-tempest" Mar 14 06:48:19 crc kubenswrapper[4713]: I0314 06:48:19.409435 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 06:48:19 crc kubenswrapper[4713]: I0314 06:48:19.899508 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 14 06:48:20 crc kubenswrapper[4713]: I0314 06:48:20.768141 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1","Type":"ContainerStarted","Data":"7d5dc183b321eec63770f1de1041061158c3593d87bb3dca8b2b017e0bdd6342"} Mar 14 06:48:29 crc kubenswrapper[4713]: I0314 06:48:29.565727 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:48:29 crc kubenswrapper[4713]: E0314 06:48:29.566704 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:48:41 crc kubenswrapper[4713]: I0314 06:48:41.564002 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:48:54 crc kubenswrapper[4713]: E0314 06:48:54.271643 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 14 06:48:54 crc kubenswrapper[4713]: E0314 06:48:54.278353 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mndjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(0236ca7c-fd1b-42f0-805c-8d53e34a3cc1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 06:48:54 crc kubenswrapper[4713]: E0314 06:48:54.279621 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" Mar 14 06:48:54 crc kubenswrapper[4713]: E0314 06:48:54.394053 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" Mar 14 06:48:54 crc 
kubenswrapper[4713]: I0314 06:48:54.905768 4713 scope.go:117] "RemoveContainer" containerID="f752b2648efd79bf5a8e96ba56bb6dd8596adfd12674ced1a6008c6a21d0c1f6" Mar 14 06:48:55 crc kubenswrapper[4713]: I0314 06:48:55.402308 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"47a21fc91748367e2fa2fed03723a1e1afd08f67eeb72f7a50138295a534bf0e"} Mar 14 06:49:08 crc kubenswrapper[4713]: I0314 06:49:08.946446 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 14 06:49:19 crc kubenswrapper[4713]: I0314 06:49:19.639496 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1","Type":"ContainerStarted","Data":"61b652ab071513c05275ebbf158c68291d61e3a0f6f2d0b3998f967f2f12ca4b"} Mar 14 06:49:19 crc kubenswrapper[4713]: I0314 06:49:19.669057 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=13.636576215 podStartE2EDuration="1m2.669035793s" podCreationTimestamp="2026-03-14 06:48:17 +0000 UTC" firstStartedPulling="2026-03-14 06:48:19.911566122 +0000 UTC m=+4882.999475422" lastFinishedPulling="2026-03-14 06:49:08.9440257 +0000 UTC m=+4932.031935000" observedRunningTime="2026-03-14 06:49:19.656890721 +0000 UTC m=+4942.744800021" watchObservedRunningTime="2026-03-14 06:49:19.669035793 +0000 UTC m=+4942.756945093" Mar 14 06:49:55 crc kubenswrapper[4713]: I0314 06:49:55.235173 4713 scope.go:117] "RemoveContainer" containerID="088dc37edd22c006c9f5e352e4d57bdb596a38c1ccdd50304c18bb0a28a136dc" Mar 14 06:49:55 crc kubenswrapper[4713]: I0314 06:49:55.288833 4713 scope.go:117] "RemoveContainer" containerID="d93879c8c4bcc0cb8c382f41eaf00a2e573ed207c8002afeaea1259f65d42f0b" Mar 14 06:49:55 crc 
kubenswrapper[4713]: I0314 06:49:55.487460 4713 scope.go:117] "RemoveContainer" containerID="24843ca49d99913b86402fbaf11e6c646164d1fe1d262a92a626f411bde9effa" Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.178967 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557850-lvfxz"] Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.181451 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557850-lvfxz" Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.184228 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.184338 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.188391 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.240749 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557850-lvfxz"] Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.270882 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-277tr\" (UniqueName: \"kubernetes.io/projected/1f54224a-a1ed-4d4d-a9e1-d0d714a114fc-kube-api-access-277tr\") pod \"auto-csr-approver-29557850-lvfxz\" (UID: \"1f54224a-a1ed-4d4d-a9e1-d0d714a114fc\") " pod="openshift-infra/auto-csr-approver-29557850-lvfxz" Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.374084 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-277tr\" (UniqueName: \"kubernetes.io/projected/1f54224a-a1ed-4d4d-a9e1-d0d714a114fc-kube-api-access-277tr\") pod \"auto-csr-approver-29557850-lvfxz\" (UID: 
\"1f54224a-a1ed-4d4d-a9e1-d0d714a114fc\") " pod="openshift-infra/auto-csr-approver-29557850-lvfxz" Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.398774 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-277tr\" (UniqueName: \"kubernetes.io/projected/1f54224a-a1ed-4d4d-a9e1-d0d714a114fc-kube-api-access-277tr\") pod \"auto-csr-approver-29557850-lvfxz\" (UID: \"1f54224a-a1ed-4d4d-a9e1-d0d714a114fc\") " pod="openshift-infra/auto-csr-approver-29557850-lvfxz" Mar 14 06:50:00 crc kubenswrapper[4713]: I0314 06:50:00.504865 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557850-lvfxz" Mar 14 06:50:01 crc kubenswrapper[4713]: I0314 06:50:01.361642 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557850-lvfxz"] Mar 14 06:50:02 crc kubenswrapper[4713]: I0314 06:50:02.094175 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557850-lvfxz" event={"ID":"1f54224a-a1ed-4d4d-a9e1-d0d714a114fc","Type":"ContainerStarted","Data":"9a6e367c197092d7c878ba67d150d134b9e3e812407150bfb382a0f3d2abbc05"} Mar 14 06:50:04 crc kubenswrapper[4713]: I0314 06:50:04.162784 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557850-lvfxz" event={"ID":"1f54224a-a1ed-4d4d-a9e1-d0d714a114fc","Type":"ContainerStarted","Data":"d71104223dbf9074ffb3c1a436be1abfd56de7afa932f922155cf1bee5bb298b"} Mar 14 06:50:04 crc kubenswrapper[4713]: I0314 06:50:04.197651 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557850-lvfxz" podStartSLOduration=3.073774245 podStartE2EDuration="4.197627371s" podCreationTimestamp="2026-03-14 06:50:00 +0000 UTC" firstStartedPulling="2026-03-14 06:50:01.382398526 +0000 UTC m=+4984.470307826" lastFinishedPulling="2026-03-14 06:50:02.506251652 +0000 UTC m=+4985.594160952" 
observedRunningTime="2026-03-14 06:50:04.191122387 +0000 UTC m=+4987.279031697" watchObservedRunningTime="2026-03-14 06:50:04.197627371 +0000 UTC m=+4987.285536681" Mar 14 06:50:06 crc kubenswrapper[4713]: I0314 06:50:06.184767 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557850-lvfxz" event={"ID":"1f54224a-a1ed-4d4d-a9e1-d0d714a114fc","Type":"ContainerDied","Data":"d71104223dbf9074ffb3c1a436be1abfd56de7afa932f922155cf1bee5bb298b"} Mar 14 06:50:06 crc kubenswrapper[4713]: I0314 06:50:06.187550 4713 generic.go:334] "Generic (PLEG): container finished" podID="1f54224a-a1ed-4d4d-a9e1-d0d714a114fc" containerID="d71104223dbf9074ffb3c1a436be1abfd56de7afa932f922155cf1bee5bb298b" exitCode=0 Mar 14 06:50:08 crc kubenswrapper[4713]: I0314 06:50:08.121039 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557850-lvfxz" Mar 14 06:50:08 crc kubenswrapper[4713]: I0314 06:50:08.195804 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-277tr\" (UniqueName: \"kubernetes.io/projected/1f54224a-a1ed-4d4d-a9e1-d0d714a114fc-kube-api-access-277tr\") pod \"1f54224a-a1ed-4d4d-a9e1-d0d714a114fc\" (UID: \"1f54224a-a1ed-4d4d-a9e1-d0d714a114fc\") " Mar 14 06:50:08 crc kubenswrapper[4713]: I0314 06:50:08.212684 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557850-lvfxz" event={"ID":"1f54224a-a1ed-4d4d-a9e1-d0d714a114fc","Type":"ContainerDied","Data":"9a6e367c197092d7c878ba67d150d134b9e3e812407150bfb382a0f3d2abbc05"} Mar 14 06:50:08 crc kubenswrapper[4713]: I0314 06:50:08.214429 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557850-lvfxz" Mar 14 06:50:08 crc kubenswrapper[4713]: I0314 06:50:08.216373 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a6e367c197092d7c878ba67d150d134b9e3e812407150bfb382a0f3d2abbc05" Mar 14 06:50:08 crc kubenswrapper[4713]: I0314 06:50:08.234298 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f54224a-a1ed-4d4d-a9e1-d0d714a114fc-kube-api-access-277tr" (OuterVolumeSpecName: "kube-api-access-277tr") pod "1f54224a-a1ed-4d4d-a9e1-d0d714a114fc" (UID: "1f54224a-a1ed-4d4d-a9e1-d0d714a114fc"). InnerVolumeSpecName "kube-api-access-277tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:50:08 crc kubenswrapper[4713]: I0314 06:50:08.300091 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-277tr\" (UniqueName: \"kubernetes.io/projected/1f54224a-a1ed-4d4d-a9e1-d0d714a114fc-kube-api-access-277tr\") on node \"crc\" DevicePath \"\"" Mar 14 06:50:08 crc kubenswrapper[4713]: I0314 06:50:08.398708 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557844-mgbbv"] Mar 14 06:50:08 crc kubenswrapper[4713]: I0314 06:50:08.414474 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557844-mgbbv"] Mar 14 06:50:09 crc kubenswrapper[4713]: I0314 06:50:09.585576 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d845ea-57e1-4693-a9c7-e202ce5ef771" path="/var/lib/kubelet/pods/78d845ea-57e1-4693-a9c7-e202ce5ef771/volumes" Mar 14 06:50:55 crc kubenswrapper[4713]: I0314 06:50:55.653398 4713 scope.go:117] "RemoveContainer" containerID="d008982ab67c6683ce14724277ac53d61d77e65e74d4eafd020c7f1995222ed4" Mar 14 06:51:10 crc kubenswrapper[4713]: I0314 06:51:10.751754 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:51:10 crc kubenswrapper[4713]: I0314 06:51:10.772596 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:51:41 crc kubenswrapper[4713]: I0314 06:51:41.599930 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:51:41 crc kubenswrapper[4713]: I0314 06:51:41.615900 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:51:47 crc kubenswrapper[4713]: I0314 06:51:47.906479 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ncc7b"] Mar 14 06:51:47 crc kubenswrapper[4713]: E0314 06:51:47.915598 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f54224a-a1ed-4d4d-a9e1-d0d714a114fc" containerName="oc" Mar 14 06:51:47 crc kubenswrapper[4713]: I0314 06:51:47.915725 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f54224a-a1ed-4d4d-a9e1-d0d714a114fc" containerName="oc" Mar 14 06:51:47 crc kubenswrapper[4713]: I0314 06:51:47.918785 4713 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1f54224a-a1ed-4d4d-a9e1-d0d714a114fc" containerName="oc" Mar 14 06:51:47 crc kubenswrapper[4713]: I0314 06:51:47.931327 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.112462 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzd5z\" (UniqueName: \"kubernetes.io/projected/066d38a5-380e-465e-912b-2fe268a4b4c4-kube-api-access-zzd5z\") pod \"community-operators-ncc7b\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.112878 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-catalog-content\") pod \"community-operators-ncc7b\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.113161 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-utilities\") pod \"community-operators-ncc7b\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.216864 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-utilities\") pod \"community-operators-ncc7b\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.220541 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zzd5z\" (UniqueName: \"kubernetes.io/projected/066d38a5-380e-465e-912b-2fe268a4b4c4-kube-api-access-zzd5z\") pod \"community-operators-ncc7b\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.220805 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-catalog-content\") pod \"community-operators-ncc7b\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.224378 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-utilities\") pod \"community-operators-ncc7b\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.224769 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-catalog-content\") pod \"community-operators-ncc7b\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.308704 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzd5z\" (UniqueName: \"kubernetes.io/projected/066d38a5-380e-465e-912b-2fe268a4b4c4-kube-api-access-zzd5z\") pod \"community-operators-ncc7b\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.354470 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:51:48 crc kubenswrapper[4713]: I0314 06:51:48.413465 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncc7b"] Mar 14 06:51:50 crc kubenswrapper[4713]: I0314 06:51:50.091175 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncc7b"] Mar 14 06:51:50 crc kubenswrapper[4713]: I0314 06:51:50.731262 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncc7b" event={"ID":"066d38a5-380e-465e-912b-2fe268a4b4c4","Type":"ContainerDied","Data":"aa3e8277457b69aefdd071a90bdc353a450a5b3f3e2bf2c0aae1c8781f25ce2a"} Mar 14 06:51:50 crc kubenswrapper[4713]: I0314 06:51:50.732238 4713 generic.go:334] "Generic (PLEG): container finished" podID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerID="aa3e8277457b69aefdd071a90bdc353a450a5b3f3e2bf2c0aae1c8781f25ce2a" exitCode=0 Mar 14 06:51:50 crc kubenswrapper[4713]: I0314 06:51:50.733130 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncc7b" event={"ID":"066d38a5-380e-465e-912b-2fe268a4b4c4","Type":"ContainerStarted","Data":"a455efc45ec47abd843d1ab224feb87877299f3f9925e8f40ca7a3c666852c5d"} Mar 14 06:51:50 crc kubenswrapper[4713]: I0314 06:51:50.743180 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 06:51:52 crc kubenswrapper[4713]: I0314 06:51:52.327513 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" podUID="409a2a8b-7e66-4763-9698-3a909f051c50" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:52 crc kubenswrapper[4713]: I0314 06:51:52.586850 4713 patch_prober.go:28] 
interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:52 crc kubenswrapper[4713]: I0314 06:51:52.587382 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:52 crc kubenswrapper[4713]: I0314 06:51:52.587525 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:52 crc kubenswrapper[4713]: I0314 06:51:52.587586 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:52 crc kubenswrapper[4713]: I0314 06:51:52.631369 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" podUID="a55d0754-702d-4dbc-995a-b98d852678ce" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Mar 14 06:51:52 crc kubenswrapper[4713]: I0314 06:51:52.758066 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncc7b" event={"ID":"066d38a5-380e-465e-912b-2fe268a4b4c4","Type":"ContainerStarted","Data":"1c36a07fee5bbbd358c73c10d9c2f80f6405cf7c8a3a9aca6bc96f355d1febd7"} Mar 14 06:51:52 crc kubenswrapper[4713]: I0314 06:51:52.924838 4713 patch_prober.go:28] interesting pod/console-79df5895fd-4nxm5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.143:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:52 crc kubenswrapper[4713]: I0314 06:51:52.925195 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-79df5895fd-4nxm5" podUID="635fd3d7-4984-4dae-9416-068a4d020d75" containerName="console" probeResult="failure" output="Get \"https://10.217.0.143:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.097437 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-75b7bc4c47-ltr87" podUID="4101fac4-706c-4e2b-9203-102d0874c3ba" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.138474 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:53 
crc kubenswrapper[4713]: I0314 06:51:53.245372 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" podUID="3941b4bd-470d-4351-aed9-4bc1f90f9ad4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.678540 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" podUID="8440ca7f-e5cd-4deb-9e52-8be733b65583" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.678591 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" podUID="8440ca7f-e5cd-4deb-9e52-8be733b65583" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.678829 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" podUID="cbc588fa-b052-4336-81fe-2fed809e251b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.678977 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" podUID="cbc588fa-b052-4336-81fe-2fed809e251b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.885481 4713 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-hqw7p container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.17:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.885500 4713 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-hqw7p container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.17:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.885780 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podUID="f9178880-ef43-43c5-8e91-f4c46d4aa0c6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.17:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:53 crc kubenswrapper[4713]: I0314 06:51:53.885783 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podUID="f9178880-ef43-43c5-8e91-f4c46d4aa0c6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.17:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:54 crc kubenswrapper[4713]: I0314 06:51:54.208407 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" podUID="4da1ed21-82a5-400c-a201-653fe58adf4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" Mar 14 06:51:54 crc kubenswrapper[4713]: I0314 06:51:54.208481 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" podUID="4da1ed21-82a5-400c-a201-653fe58adf4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:54 crc kubenswrapper[4713]: I0314 06:51:54.505378 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" podUID="4353213b-b89f-4288-babb-7afef0ca216a" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:54 crc kubenswrapper[4713]: I0314 06:51:54.505422 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:54 crc kubenswrapper[4713]: I0314 06:51:54.505437 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:54 crc kubenswrapper[4713]: I0314 06:51:54.505404 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:54 crc 
kubenswrapper[4713]: I0314 06:51:54.505396 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" podUID="4353213b-b89f-4288-babb-7afef0ca216a" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:54 crc kubenswrapper[4713]: I0314 06:51:54.978434 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" podUID="129ebe3f-95aa-42f1-8f56-1d3120fb5419" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:54 crc kubenswrapper[4713]: I0314 06:51:54.979062 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" podUID="129ebe3f-95aa-42f1-8f56-1d3120fb5419" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:55 crc kubenswrapper[4713]: I0314 06:51:55.043054 4713 patch_prober.go:28] interesting pod/oauth-openshift-7dc7444945-bll47 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:55 crc kubenswrapper[4713]: I0314 06:51:55.043124 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:55 crc kubenswrapper[4713]: I0314 06:51:55.043378 4713 patch_prober.go:28] interesting pod/oauth-openshift-7dc7444945-bll47 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:55 crc kubenswrapper[4713]: I0314 06:51:55.043457 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:55 crc kubenswrapper[4713]: I0314 06:51:55.315417 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" podUID="6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:55 crc kubenswrapper[4713]: I0314 06:51:55.476420 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-44kkb" podUID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerName="registry-server" probeResult="failure" output=< Mar 14 06:51:55 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:51:55 crc kubenswrapper[4713]: > Mar 14 06:51:55 crc kubenswrapper[4713]: I0314 06:51:55.476856 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-44kkb" podUID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerName="registry-server" 
probeResult="failure" output=< Mar 14 06:51:55 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:51:55 crc kubenswrapper[4713]: > Mar 14 06:51:55 crc kubenswrapper[4713]: I0314 06:51:55.990458 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-cl7ll" podUID="a5cbbe27-0738-4819-a4bc-5bc7d2945248" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:55 crc kubenswrapper[4713]: I0314 06:51:55.990655 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-cl7ll" podUID="a5cbbe27-0738-4819-a4bc-5bc7d2945248" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:56 crc kubenswrapper[4713]: I0314 06:51:56.817615 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c2c79b18-2189-46d9-bbd4-55f58870d723" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 14 06:51:56 crc kubenswrapper[4713]: I0314 06:51:56.936607 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9tzcq"] Mar 14 06:51:56 crc kubenswrapper[4713]: I0314 06:51:56.998478 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.083729 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.083792 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.083734 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.083840 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.263198 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn7q4\" (UniqueName: \"kubernetes.io/projected/8026f430-07d6-4f1a-98ac-b39a9ad6130d-kube-api-access-tn7q4\") pod \"certified-operators-9tzcq\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.263773 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-catalog-content\") pod \"certified-operators-9tzcq\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.263802 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-utilities\") pod \"certified-operators-9tzcq\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.366365 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn7q4\" (UniqueName: \"kubernetes.io/projected/8026f430-07d6-4f1a-98ac-b39a9ad6130d-kube-api-access-tn7q4\") pod \"certified-operators-9tzcq\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.366616 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-catalog-content\") pod \"certified-operators-9tzcq\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 
06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.366655 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-utilities\") pod \"certified-operators-9tzcq\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.469578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-catalog-content\") pod \"certified-operators-9tzcq\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.513519 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-utilities\") pod \"certified-operators-9tzcq\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.662784 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn7q4\" (UniqueName: \"kubernetes.io/projected/8026f430-07d6-4f1a-98ac-b39a9ad6130d-kube-api-access-tn7q4\") pod \"certified-operators-9tzcq\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.707249 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.707310 4713 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.716741 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.716804 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.806643 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerName="galera" probeResult="failure" output="command timed out" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.807793 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerName="galera" probeResult="failure" output="command timed out" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.846371 4713 patch_prober.go:28] interesting pod/route-controller-manager-6c9756c5df-htcbt container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 
06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.846422 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.846371 4713 patch_prober.go:28] interesting pod/route-controller-manager-6c9756c5df-htcbt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.846556 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:57 crc kubenswrapper[4713]: I0314 06:51:57.863456 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:51:58 crc kubenswrapper[4713]: I0314 06:51:58.845404 4713 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-25r25 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.91:9443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:58 crc kubenswrapper[4713]: I0314 06:51:58.846198 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" podUID="a3c3dff8-a2ea-4073-a6ca-c391aaf296d0" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.91:9443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:58 crc kubenswrapper[4713]: I0314 06:51:58.900271 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9tzcq"] Mar 14 06:51:59 crc kubenswrapper[4713]: I0314 06:51:59.695830 4713 patch_prober.go:28] interesting pod/metrics-server-6d5f446985-q8pw2 container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:59 crc kubenswrapper[4713]: I0314 06:51:59.696155 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" podUID="d7cea157-995e-400c-b2ee-85357ae7fb7b" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:51:59 crc kubenswrapper[4713]: I0314 06:51:59.695892 4713 patch_prober.go:28] interesting pod/metrics-server-6d5f446985-q8pw2 
container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:51:59 crc kubenswrapper[4713]: I0314 06:51:59.696313 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" podUID="d7cea157-995e-400c-b2ee-85357ae7fb7b" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:00 crc kubenswrapper[4713]: I0314 06:52:00.040466 4713 patch_prober.go:28] interesting pod/monitoring-plugin-84469c67d6-74jtt container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:00 crc kubenswrapper[4713]: I0314 06:52:00.040540 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" podUID="88a15bde-288a-4e1f-b537-7127832ecb65" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.326980 4713 patch_prober.go:28] interesting pod/thanos-querier-596654c596-mpwzl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.327427 4713 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" podUID="aee49a16-349d-4656-a0d0-c78cb70ca08f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.366659 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.366744 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.366749 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.366853 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.562575 4713 patch_prober.go:28] interesting 
pod/authentication-operator-69f744f599-ztqtl container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.562633 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" podUID="d87e3f74-fd38-4b24-b489-cf054f0e8375" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.581316 4713 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-h7xw6 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:01 crc kubenswrapper[4713]: I0314 06:52:01.581444 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" podUID="82a7870b-ac91-41f9-a94f-41db191e711b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.195463 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" podUID="9da309ad-34cc-4b06-b166-c571b5a39825" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.195468 4713 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" podUID="9da309ad-34cc-4b06-b166-c571b5a39825" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341389 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qkdqn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341418 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341451 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" podUID="ed4a1500-6481-4d26-a107-f76299623688" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341477 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qkdqn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 
06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341481 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341510 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" podUID="ed4a1500-6481-4d26-a107-f76299623688" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341547 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341562 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341589 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341602 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341636 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.341649 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.508442 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" podUID="50d43641-0638-4763-9123-0c0c2c76629e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.508662 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" podUID="d5c6be47-5c06-46e0-ae8c-87b7a3f23561" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.508715 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" podUID="50d43641-0638-4763-9123-0c0c2c76629e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.508726 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" podUID="d5c6be47-5c06-46e0-ae8c-87b7a3f23561" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.667464 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.667530 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.667581 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.667590 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.667599 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" podUID="aa4ff369-f2af-439f-b9f6-2c8301e80210" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.667813 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" podUID="aa4ff369-f2af-439f-b9f6-2c8301e80210" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.707053 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.707128 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.716147 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.716281 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.965388 4713 patch_prober.go:28] interesting pod/console-79df5895fd-4nxm5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.143:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:02 crc kubenswrapper[4713]: I0314 06:52:02.965464 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-79df5895fd-4nxm5" podUID="635fd3d7-4984-4dae-9416-068a4d020d75" containerName="console" probeResult="failure" output="Get \"https://10.217.0.143:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:03 crc kubenswrapper[4713]: I0314 06:52:03.180434 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:03 crc kubenswrapper[4713]: I0314 06:52:03.180748 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:03 crc kubenswrapper[4713]: I0314 06:52:03.679567 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" podUID="8440ca7f-e5cd-4deb-9e52-8be733b65583" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:03 crc kubenswrapper[4713]: I0314 06:52:03.679567 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" podUID="8440ca7f-e5cd-4deb-9e52-8be733b65583" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:03 crc kubenswrapper[4713]: I0314 06:52:03.843408 4713 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-hqw7p container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.17:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:03 crc kubenswrapper[4713]: I0314 06:52:03.843482 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podUID="f9178880-ef43-43c5-8e91-f4c46d4aa0c6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.17:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.087977 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:04 crc kubenswrapper[4713]: timeout: health rpc did not complete within 1s
Mar 14 06:52:04 crc kubenswrapper[4713]: >
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.087973 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:04 crc kubenswrapper[4713]: timeout: health rpc did not complete within 1s
Mar 14 06:52:04 crc kubenswrapper[4713]: >
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.091634 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:04 crc kubenswrapper[4713]: timeout: health rpc did not complete within 1s
Mar 14 06:52:04 crc kubenswrapper[4713]: >
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.092633 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:04 crc kubenswrapper[4713]: timeout: health rpc did not complete within 1s
Mar 14 06:52:04 crc kubenswrapper[4713]: >
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.167433 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" podUID="4da1ed21-82a5-400c-a201-653fe58adf4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.380635 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" podUID="4353213b-b89f-4288-babb-7afef0ca216a" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.509664 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.509707 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.509663 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" podUID="4353213b-b89f-4288-babb-7afef0ca216a" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.509801 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:04 crc kubenswrapper[4713]: I0314 06:52:04.761429 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" podUID="92164fd9-b08c-4b00-975c-0fcdd245f8f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:05 crc kubenswrapper[4713]: I0314 06:52:05.042782 4713 patch_prober.go:28] interesting pod/oauth-openshift-7dc7444945-bll47 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:05 crc kubenswrapper[4713]: I0314 06:52:05.042861 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:05 crc kubenswrapper[4713]: I0314 06:52:05.042785 4713 patch_prober.go:28] interesting pod/oauth-openshift-7dc7444945-bll47 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:05 crc kubenswrapper[4713]: I0314 06:52:05.043007 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:05 crc kubenswrapper[4713]: I0314 06:52:05.315413 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" podUID="6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:06 crc kubenswrapper[4713]: I0314 06:52:06.219002 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-44kkb" podUID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:06 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:06 crc kubenswrapper[4713]: >
Mar 14 06:52:06 crc kubenswrapper[4713]: I0314 06:52:06.219306 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-44kkb" podUID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:06 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:06 crc kubenswrapper[4713]: >
Mar 14 06:52:06 crc kubenswrapper[4713]: I0314 06:52:06.811314 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c2c79b18-2189-46d9-bbd4-55f58870d723" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.083512 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.083568 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.083995 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.084041 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.502355 4713 patch_prober.go:28] interesting pod/loki-operator-controller-manager-66945dfc9f-xqf5p container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.50:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.502362 4713 patch_prober.go:28] interesting pod/loki-operator-controller-manager-66945dfc9f-xqf5p container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.502414 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" podUID="aec4bfff-0bef-401b-9db6-f9046825614a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.502443 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-66945dfc9f-xqf5p" podUID="aec4bfff-0bef-401b-9db6-f9046825614a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.707199 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.707581 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.708295 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.708329 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.716704 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.716868 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.716995 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.717092 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.757862 4713 patch_prober.go:28] interesting pod/controller-manager-56cb9c466-g7c95 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.757792 4713 patch_prober.go:28] interesting pod/controller-manager-56cb9c466-g7c95 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.757922 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" podUID="cfebbac0-ce4d-43c1-b872-293d64e8256b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.757951 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" podUID="cfebbac0-ce4d-43c1-b872-293d64e8256b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.806482 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerName="galera" probeResult="failure" output="command timed out"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.807659 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerName="galera" probeResult="failure" output="command timed out"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.809770 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="249e8a7d-5c1c-4d41-a243-6ab0ad96094c" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.809900 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="249e8a7d-5c1c-4d41-a243-6ab0ad96094c" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.849404 4713 patch_prober.go:28] interesting pod/route-controller-manager-6c9756c5df-htcbt container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.849479 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.849418 4713 patch_prober.go:28] interesting pod/route-controller-manager-6c9756c5df-htcbt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:07 crc kubenswrapper[4713]: I0314 06:52:07.849537 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:08 crc kubenswrapper[4713]: I0314 06:52:08.135818 4713 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:08 crc kubenswrapper[4713]: I0314 06:52:08.135893 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:08 crc kubenswrapper[4713]: I0314 06:52:08.305435 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:08 crc kubenswrapper[4713]: I0314 06:52:08.305477 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:08 crc kubenswrapper[4713]: I0314 06:52:08.305505 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:08 crc kubenswrapper[4713]: I0314 06:52:08.305538 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:08 crc kubenswrapper[4713]: I0314 06:52:08.802142 4713 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-25r25 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.91:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:08 crc kubenswrapper[4713]: I0314 06:52:08.802197 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" podUID="a3c3dff8-a2ea-4073-a6ca-c391aaf296d0" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.91:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.369713 4713 trace.go:236] Trace[938082187]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (14-Mar-2026 06:52:05.112) (total time: 4255ms):
Mar 14 06:52:09 crc kubenswrapper[4713]: Trace[938082187]: [4.255666431s] [4.255666431s] END
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.601376 4713 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bzczw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.601446 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bzczw" podUID="2c50b2f7-7be4-4125-94ac-525d908a9e86" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.601501 4713 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bzczw container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.601607 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-bzczw" podUID="2c50b2f7-7be4-4125-94ac-525d908a9e86" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.696024 4713 patch_prober.go:28] interesting pod/metrics-server-6d5f446985-q8pw2 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.696101 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" podUID="d7cea157-995e-400c-b2ee-85357ae7fb7b" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.808314 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b" containerName="galera" probeResult="failure" output="command timed out"
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.808903 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b" containerName="galera" probeResult="failure" output="command timed out"
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.831425 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557852-7mmfv"]
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.871820 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557852-7mmfv"
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.932194 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hqg\" (UniqueName: \"kubernetes.io/projected/97afee50-131f-4e19-a172-0020e3607abc-kube-api-access-29hqg\") pod \"auto-csr-approver-29557852-7mmfv\" (UID: \"97afee50-131f-4e19-a172-0020e3607abc\") " pod="openshift-infra/auto-csr-approver-29557852-7mmfv"
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.981020 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:52:09 crc kubenswrapper[4713]: I0314 06:52:09.981033 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.034371 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29hqg\" (UniqueName: \"kubernetes.io/projected/97afee50-131f-4e19-a172-0020e3607abc-kube-api-access-29hqg\") pod \"auto-csr-approver-29557852-7mmfv\" (UID: \"97afee50-131f-4e19-a172-0020e3607abc\") " pod="openshift-infra/auto-csr-approver-29557852-7mmfv"
Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.040883 4713 patch_prober.go:28] interesting pod/monitoring-plugin-84469c67d6-74jtt container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.040970 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" podUID="88a15bde-288a-4e1f-b537-7127832ecb65" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.171414 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.598626 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29hqg\" (UniqueName: \"kubernetes.io/projected/97afee50-131f-4e19-a172-0020e3607abc-kube-api-access-29hqg\") pod \"auto-csr-approver-29557852-7mmfv\" (UID: \"97afee50-131f-4e19-a172-0020e3607abc\") " pod="openshift-infra/auto-csr-approver-29557852-7mmfv"
Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.731417 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.731475 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.732163 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5"
Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.736957 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon"
containerStatusID={"Type":"cri-o","ID":"47a21fc91748367e2fa2fed03723a1e1afd08f67eeb72f7a50138295a534bf0e"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.738732 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://47a21fc91748367e2fa2fed03723a1e1afd08f67eeb72f7a50138295a534bf0e" gracePeriod=600 Mar 14 06:52:10 crc kubenswrapper[4713]: I0314 06:52:10.830473 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557852-7mmfv" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.314312 4713 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.314668 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.325375 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.325398 4713 patch_prober.go:28] interesting pod/thanos-querier-596654c596-mpwzl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.325490 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.325573 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" podUID="aee49a16-349d-4656-a0d0-c78cb70ca08f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.448417 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.448486 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.449059 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.449108 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.449290 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.449312 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.522821 4713 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-ztqtl container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.523594 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-ztqtl" podUID="d87e3f74-fd38-4b24-b489-cf054f0e8375" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.582349 4713 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-h7xw6 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.582408 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" podUID="82a7870b-ac91-41f9-a94f-41db191e711b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.696473 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" podUID="12eb62d0-8721-4482-b4a3-148a61cea029" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.737410 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" 
podUID="fa62dff3-1643-4e94-b31a-d56b21a2327d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.737439 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" podUID="4128f2c6-d929-4815-8502-291baf22f24f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.747080 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-zz92h" podUID="bae008e7-4329-4d30-9820-81daf4300f96" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.811490 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c2c79b18-2189-46d9-bbd4-55f58870d723" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 14 06:52:11 crc kubenswrapper[4713]: I0314 06:52:11.811583 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c2c79b18-2189-46d9-bbd4-55f58870d723" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.004857 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="47a21fc91748367e2fa2fed03723a1e1afd08f67eeb72f7a50138295a534bf0e" exitCode=0 Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.004919 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"47a21fc91748367e2fa2fed03723a1e1afd08f67eeb72f7a50138295a534bf0e"} Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.005309 4713 scope.go:117] "RemoveContainer" containerID="ea51ad9618ee4448cd436f9a3bdb0383242147f8726843397f68751c09008d60" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.173083 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" podUID="9da309ad-34cc-4b06-b166-c571b5a39825" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.215376 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" podUID="46274028-feea-4c48-b086-44533fc3e996" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.257666 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-46gkk" podUID="149dd450-69f3-4d71-aac3-90052dcf2253" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344429 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= 
Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344429 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qkdqn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344514 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344530 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" podUID="ed4a1500-6481-4d26-a107-f76299623688" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344574 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qkdqn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344611 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkdqn" podUID="ed4a1500-6481-4d26-a107-f76299623688" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344704 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344719 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344732 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.344741 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.428620 4713 patch_prober.go:28] interesting pod/router-default-5444994796-s276w 
container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.428647 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7t4g8" podUID="d5c6be47-5c06-46e0-ae8c-87b7a3f23561" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.428740 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-s276w" podUID="d626c9fa-84ff-40c0-ae90-c477a699591a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.469378 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" podUID="50d43641-0638-4763-9123-0c0c2c76629e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.469397 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" podUID="409a2a8b-7e66-4763-9698-3a909f051c50" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.478393 4713 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-rbp8f 
container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.478442 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" podUID="ed72a9eb-a4ee-430c-9449-566f2c56c3bf" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.555727 4713 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-5d4kx container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.556043 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" podUID="1d1f7414-07c8-48ab-bc8b-3892473aa10f" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.582057 4713 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-h7xw6 container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.582125 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" podUID="82a7870b-ac91-41f9-a94f-41db191e711b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.626389 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.626439 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.626419 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" podUID="aa4ff369-f2af-439f-b9f6-2c8301e80210" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.626476 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" 
probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.626474 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.626553 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.627578 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.628919 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"682685c348eba1a5440a7b263039edfb02505d518bcfe044f565cb9553b13b1c"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" containerMessage="Container packageserver failed liveness probe, will be restarted" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.628955 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" containerID="cri-o://682685c348eba1a5440a7b263039edfb02505d518bcfe044f565cb9553b13b1c" gracePeriod=30 Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.667503 4713 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" podUID="a55d0754-702d-4dbc-995a-b98d852678ce" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753468 4713 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-w65zj container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753542 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753583 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753539 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" podUID="0dbd3bea-9644-4bc5-96c7-822b26810706" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753621 4713 patch_prober.go:28] interesting 
pod/package-server-manager-789f6589d5-w65zj container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753636 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" podUID="0dbd3bea-9644-4bc5-96c7-822b26810706" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753604 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753584 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753685 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc 
kubenswrapper[4713]: I0314 06:52:12.753746 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753768 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.753793 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.808810 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="249e8a7d-5c1c-4d41-a243-6ab0ad96094c" containerName="prometheus" probeResult="failure" output="command timed out" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.810270 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="249e8a7d-5c1c-4d41-a243-6ab0ad96094c" containerName="prometheus" probeResult="failure" output="command timed out" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.924170 4713 patch_prober.go:28] interesting pod/console-79df5895fd-4nxm5 container/console namespace/openshift-console: Readiness probe status=failure 
output="Get \"https://10.217.0.143:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.924235 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-79df5895fd-4nxm5" podUID="635fd3d7-4984-4dae-9416-068a4d020d75" containerName="console" probeResult="failure" output="Get \"https://10.217.0.143:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:12 crc kubenswrapper[4713]: I0314 06:52:12.924308 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-79df5895fd-4nxm5" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.025757 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"} Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.138386 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.138498 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.250510 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" podUID="3941b4bd-470d-4351-aed9-4bc1f90f9ad4" containerName="manager" 
probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.322449 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84b689b795-q7lfp" podUID="6c7267d1-1d86-4ea3-91c6-5edc53bdfe01" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.97:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.478238 4713 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-rbp8f container/loki-querier namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.478299 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" podUID="ed72a9eb-a4ee-430c-9449-566f2c56c3bf" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.493525 4713 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.493581 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f04ba68c-50bf-406f-977f-7cf9b7d1f4b4" containerName="loki-ingester" probeResult="failure" output="Get 
\"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.555360 4713 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-5d4kx container/loki-query-frontend namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.555441 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" podUID="1d1f7414-07c8-48ab-bc8b-3892473aa10f" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.615164 4713 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.615248 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="fd0e6ea3-0887-4eba-a83d-f76a405b0d56" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.689796 4713 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.689871 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="ac2f3622-77e0-46da-95dc-1a17548790a7" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.760561 4713 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-wnccn container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.7:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.760620 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" podUID="8440ca7f-e5cd-4deb-9e52-8be733b65583" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.760627 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" podUID="94841b22-b2eb-4519-b04f-98010d848b46" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.7:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.760710 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.760572 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf 
container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.761044 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.776674 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="webhook-server" containerStatusID={"Type":"cri-o","ID":"24315edf7f9c7831495988b0093625f781a9e53320b097ca2701ae04d824b0de"} pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" containerMessage="Container webhook-server failed liveness probe, will be restarted" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.776767 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" podUID="8440ca7f-e5cd-4deb-9e52-8be733b65583" containerName="webhook-server" containerID="cri-o://24315edf7f9c7831495988b0093625f781a9e53320b097ca2701ae04d824b0de" gracePeriod=2 Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843387 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" podUID="8440ca7f-e5cd-4deb-9e52-8be733b65583" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 
06:52:13.843437 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843505 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843503 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843555 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843576 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843611 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/gateway namespace/openshift-logging: Liveness probe status=failure 
output="Get \"https://10.217.0.57:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843624 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843642 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.57:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843661 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843706 4713 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-wnccn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.7:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843721 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" podUID="94841b22-b2eb-4519-b04f-98010d848b46" 
containerName="operator" probeResult="failure" output="Get \"http://10.217.0.7:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.844002 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" podUID="cbc588fa-b052-4336-81fe-2fed809e251b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.843422 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" podUID="cbc588fa-b052-4336-81fe-2fed809e251b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.925407 4713 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-hqw7p container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.17:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.925449 4713 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-hqw7p container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.17:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.925733 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podUID="f9178880-ef43-43c5-8e91-f4c46d4aa0c6" containerName="perses-operator" 
probeResult="failure" output="Get \"http://10.217.0.17:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.925811 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.925794 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podUID="f9178880-ef43-43c5-8e91-f4c46d4aa0c6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.17:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.925540 4713 patch_prober.go:28] interesting pod/console-79df5895fd-4nxm5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.143:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:13 crc kubenswrapper[4713]: I0314 06:52:13.925882 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-79df5895fd-4nxm5" podUID="635fd3d7-4984-4dae-9416-068a4d020d75" containerName="console" probeResult="failure" output="Get \"https://10.217.0.143:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.251505 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" podUID="4da1ed21-82a5-400c-a201-653fe58adf4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.251567 4713 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" podUID="4da1ed21-82a5-400c-a201-653fe58adf4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.251625 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.254514 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.494390 4713 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.58:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.494883 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-ingester-0" podUID="f04ba68c-50bf-406f-977f-7cf9b7d1f4b4" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.517420 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" podUID="4353213b-b89f-4288-babb-7afef0ca216a" 
containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.517511 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518045 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518067 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" podUID="4353213b-b89f-4288-babb-7afef0ca216a" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518099 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518169 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-64h8p" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518169 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518222 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518236 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518227 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518290 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.518483 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.519811 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr-k8s-webhook-server" containerStatusID={"Type":"cri-o","ID":"aaf5cad0707989fc94a29b45cd275755b022b2382c7d94e7f417872e3f270f54"} pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" containerMessage="Container 
frr-k8s-webhook-server failed liveness probe, will be restarted" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.519853 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" podUID="4353213b-b89f-4288-babb-7afef0ca216a" containerName="frr-k8s-webhook-server" containerID="cri-o://aaf5cad0707989fc94a29b45cd275755b022b2382c7d94e7f417872e3f270f54" gracePeriod=10 Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.523842 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"450c6e52ec6a5edfbb0bb0de38472b0a24f097ed56db00c6e347debcd90953b9"} pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.523904 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" containerID="cri-o://450c6e52ec6a5edfbb0bb0de38472b0a24f097ed56db00c6e347debcd90953b9" gracePeriod=30 Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.602492 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-knjfw" podUID="9640b5fe-f2ba-4a12-b456-1643ddc063f2" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.100:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.602495 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.602615 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-64h8p" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.602660 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.602681 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-64h8p" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.602907 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-knjfw" podUID="9640b5fe-f2ba-4a12-b456-1643ddc063f2" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.100:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.614707 4713 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.60:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.614767 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-compactor-0" podUID="fd0e6ea3-0887-4eba-a83d-f76a405b0d56" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.60:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc 
kubenswrapper[4713]: I0314 06:52:14.689380 4713 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.61:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.689460 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="ac2f3622-77e0-46da-95dc-1a17548790a7" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.61:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.802462 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" podUID="92164fd9-b08c-4b00-975c-0fcdd245f8f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.802445 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-85d9999fbb-kdnkw" podUID="92164fd9-b08c-4b00-975c-0fcdd245f8f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:14 crc kubenswrapper[4713]: I0314 06:52:14.978432 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" podUID="129ebe3f-95aa-42f1-8f56-1d3120fb5419" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.019507 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" podUID="129ebe3f-95aa-42f1-8f56-1d3120fb5419" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.019513 4713 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-hqw7p container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.17:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.019628 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" podUID="f9178880-ef43-43c5-8e91-f4c46d4aa0c6" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.17:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.043122 4713 patch_prober.go:28] interesting pod/oauth-openshift-7dc7444945-bll47 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.043149 4713 patch_prober.go:28] interesting pod/oauth-openshift-7dc7444945-bll47 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.043221 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.043318 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.043304 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.043389 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.044922 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="oauth-openshift" containerStatusID={"Type":"cri-o","ID":"ef5af81019e2cc410dddb8817db8a366260075d08aa35fcae4eb5912ea7b56a4"} pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" containerMessage="Container oauth-openshift failed liveness probe, will be restarted" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.051934 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller" containerStatusID={"Type":"cri-o","ID":"fa9c779b58a19a8105e4e00fbd558ac3b5fad3f2ebff0bcc9dd6b29f92f3d5d5"} pod="metallb-system/frr-k8s-64h8p" 
containerMessage="Container controller failed liveness probe, will be restarted" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.052006 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"ae8f3f497a7e1f2784f8929d857ad944545fb9d4a5baaa285c02257b2f6619ad"} pod="metallb-system/frr-k8s-64h8p" containerMessage="Container frr failed liveness probe, will be restarted" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.052146 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" containerID="cri-o://fa9c779b58a19a8105e4e00fbd558ac3b5fad3f2ebff0bcc9dd6b29f92f3d5d5" gracePeriod=2 Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.334391 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" podUID="4da1ed21-82a5-400c-a201-653fe58adf4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.334782 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-xdfql" podUID="6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.561426 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.822449 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output="command timed out" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.822510 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output="command timed out" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.822547 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output="command timed out" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.823116 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output="command timed out" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.990474 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-cl7ll" podUID="a5cbbe27-0738-4819-a4bc-5bc7d2945248" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.990504 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-cl7ll" podUID="a5cbbe27-0738-4819-a4bc-5bc7d2945248" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.990618 
4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="5e00395b-5b37-4ba4-a4e7-7ad08388b053" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:15 crc kubenswrapper[4713]: I0314 06:52:15.990651 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="5e00395b-5b37-4ba4-a4e7-7ad08388b053" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:16 crc kubenswrapper[4713]: I0314 06:52:16.044452 4713 patch_prober.go:28] interesting pod/oauth-openshift-7dc7444945-bll47 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:16 crc kubenswrapper[4713]: I0314 06:52:16.044544 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:16 crc kubenswrapper[4713]: I0314 06:52:16.305083 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 14 06:52:16 
crc kubenswrapper[4713]: I0314 06:52:16.305138 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 14 06:52:16 crc kubenswrapper[4713]: I0314 06:52:16.325904 4713 patch_prober.go:28] interesting pod/thanos-querier-596654c596-mpwzl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:16 crc kubenswrapper[4713]: I0314 06:52:16.325949 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-596654c596-mpwzl" podUID="aee49a16-349d-4656-a0d0-c78cb70ca08f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:16 crc kubenswrapper[4713]: E0314 06:52:16.566781 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00783a2_42c7_45b5_b83d_136c314b0086.slice/crio-682685c348eba1a5440a7b263039edfb02505d518bcfe044f565cb9553b13b1c.scope\": RecentStats: unable to find data in memory cache]" Mar 14 06:52:16 crc kubenswrapper[4713]: I0314 06:52:16.603396 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:16 crc 
kubenswrapper[4713]: I0314 06:52:16.817690 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c2c79b18-2189-46d9-bbd4-55f58870d723" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 14 06:52:16 crc kubenswrapper[4713]: I0314 06:52:16.817776 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Mar 14 06:52:16 crc kubenswrapper[4713]: I0314 06:52:16.833364 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"bd0395a1990f66cb6d99e088c548f943054f350eec3636be7b07d8ea6d5e8ac0"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Mar 14 06:52:16 crc kubenswrapper[4713]: I0314 06:52:16.833489 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c2c79b18-2189-46d9-bbd4-55f58870d723" containerName="ceilometer-central-agent" containerID="cri-o://bd0395a1990f66cb6d99e088c548f943054f350eec3636be7b07d8ea6d5e8ac0" gracePeriod=30 Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.083727 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.083764 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.083800 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.083821 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.083854 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.099401 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.108863 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus-operator-admission-webhook" containerStatusID={"Type":"cri-o","ID":"d75b4b698387d0513da2b6b82c82fe51ae18108798499ede476fbf54d04cd679"} pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" containerMessage="Container prometheus-operator-admission-webhook failed liveness probe, will be restarted" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 
06:52:17.108942 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" containerID="cri-o://d75b4b698387d0513da2b6b82c82fe51ae18108798499ede476fbf54d04cd679" gracePeriod=30 Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.203645 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" event={"ID":"8440ca7f-e5cd-4deb-9e52-8be733b65583","Type":"ContainerDied","Data":"24315edf7f9c7831495988b0093625f781a9e53320b097ca2701ae04d824b0de"} Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.223838 4713 generic.go:334] "Generic (PLEG): container finished" podID="8440ca7f-e5cd-4deb-9e52-8be733b65583" containerID="24315edf7f9c7831495988b0093625f781a9e53320b097ca2701ae04d824b0de" exitCode=137 Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.227480 4713 generic.go:334] "Generic (PLEG): container finished" podID="b00783a2-42c7-45b5-b83d-136c314b0086" containerID="682685c348eba1a5440a7b263039edfb02505d518bcfe044f565cb9553b13b1c" exitCode=0 Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.227527 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" event={"ID":"b00783a2-42c7-45b5-b83d-136c314b0086","Type":"ContainerDied","Data":"682685c348eba1a5440a7b263039edfb02505d518bcfe044f565cb9553b13b1c"} Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.290734 4713 trace.go:236] Trace[1603609790]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (14-Mar-2026 06:52:15.130) (total time: 2155ms): Mar 14 06:52:17 crc kubenswrapper[4713]: Trace[1603609790]: [2.155364668s] [2.155364668s] END Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.290758 4713 trace.go:236] Trace[435003165]: 
"Calculate volume metrics of persistence for pod openstack/rabbitmq-server-1" (14-Mar-2026 06:52:16.193) (total time: 1093ms): Mar 14 06:52:17 crc kubenswrapper[4713]: Trace[435003165]: [1.093019325s] [1.093019325s] END Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.290738 4713 trace.go:236] Trace[579955954]: "Calculate volume metrics of prometheus-metric-storage-db for pod openstack/prometheus-metric-storage-0" (14-Mar-2026 06:52:12.700) (total time: 4585ms): Mar 14 06:52:17 crc kubenswrapper[4713]: Trace[579955954]: [4.585849559s] [4.585849559s] END Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.290741 4713 trace.go:236] Trace[1918480728]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-42m74" (14-Mar-2026 06:52:14.481) (total time: 2804ms): Mar 14 06:52:17 crc kubenswrapper[4713]: Trace[1918480728]: [2.804787708s] [2.804787708s] END Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.571374 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-vmvfw" podUID="e412202e-9dd7-4ebb-90a4-c25cbf3241b8" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:17 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:17 crc kubenswrapper[4713]: > Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.571381 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-vmvfw" podUID="e412202e-9dd7-4ebb-90a4-c25cbf3241b8" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:17 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:17 crc kubenswrapper[4713]: > Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.571535 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-prgds" podUID="58f78f5a-d3da-4bf6-bf82-c98dbbe9602f" 
containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:17 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:17 crc kubenswrapper[4713]: > Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.573522 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-prgds" podUID="58f78f5a-d3da-4bf6-bf82-c98dbbe9602f" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:17 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:17 crc kubenswrapper[4713]: > Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.707992 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.708055 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.709496 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.709656 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" 
podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.716337 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.716370 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.716406 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.716471 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.757101 4713 patch_prober.go:28] interesting pod/controller-manager-56cb9c466-g7c95 container/controller-manager namespace/openshift-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.757279 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" podUID="cfebbac0-ce4d-43c1-b872-293d64e8256b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.757457 4713 patch_prober.go:28] interesting pod/controller-manager-56cb9c466-g7c95 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.757487 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-56cb9c466-g7c95" podUID="cfebbac0-ce4d-43c1-b872-293d64e8256b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.764689 4713 patch_prober.go:28] interesting pod/route-controller-manager-6c9756c5df-htcbt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.764745 
4713 patch_prober.go:28] interesting pod/route-controller-manager-6c9756c5df-htcbt container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.764760 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.764823 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.764972 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.778918 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"01f3d0fa33a108eb57c269fe7ac1b047e66c75ed71d13bcce3f53e8a0e985294"} pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" containerMessage="Container route-controller-manager failed liveness probe, will be restarted" Mar 14 06:52:17 crc 
kubenswrapper[4713]: I0314 06:52:17.779000 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" containerID="cri-o://01f3d0fa33a108eb57c269fe7ac1b047e66c75ed71d13bcce3f53e8a0e985294" gracePeriod=30 Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.806744 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="249e8a7d-5c1c-4d41-a243-6ab0ad96094c" containerName="prometheus" probeResult="failure" output="command timed out" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.806926 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerName="galera" probeResult="failure" output="command timed out" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.807018 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.807093 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerName="galera" probeResult="failure" output="command timed out" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.807166 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.808294 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"14748d02753ab502f9ae5a5c30c424c7a250193acb463caf9edf740cde85c571"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.808989 4713 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="249e8a7d-5c1c-4d41-a243-6ab0ad96094c" containerName="prometheus" probeResult="failure" output="command timed out" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.809098 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.811424 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-44kkb" podUID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerName="registry-server" probeResult="failure" output="command timed out" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.811471 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-index-44kkb" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.811555 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-44kkb" podUID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerName="registry-server" probeResult="failure" output="command timed out" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.815520 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-44kkb" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.852563 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766"} pod="openstack-operators/openstack-operator-index-44kkb" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.852637 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-44kkb" 
podUID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerName="registry-server" containerID="cri-o://1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766" gracePeriod=30 Mar 14 06:52:17 crc kubenswrapper[4713]: I0314 06:52:17.988656 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="frr" containerID="cri-o://ae8f3f497a7e1f2784f8929d857ad944545fb9d4a5baaa285c02257b2f6619ad" gracePeriod=2 Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.099898 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.099973 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.136363 4713 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.136430 4713 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.272933 4713 generic.go:334] "Generic (PLEG): container finished" podID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerID="fa9c779b58a19a8105e4e00fbd558ac3b5fad3f2ebff0bcc9dd6b29f92f3d5d5" exitCode=0
Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.273039 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerDied","Data":"fa9c779b58a19a8105e4e00fbd558ac3b5fad3f2ebff0bcc9dd6b29f92f3d5d5"}
Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.275589 4713 generic.go:334] "Generic (PLEG): container finished" podID="4353213b-b89f-4288-babb-7afef0ca216a" containerID="aaf5cad0707989fc94a29b45cd275755b022b2382c7d94e7f417872e3f270f54" exitCode=0
Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.275624 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" event={"ID":"4353213b-b89f-4288-babb-7afef0ca216a","Type":"ContainerDied","Data":"aaf5cad0707989fc94a29b45cd275755b022b2382c7d94e7f417872e3f270f54"}
Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.803578 4713 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-25r25 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.91:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.803654 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25" podUID="a3c3dff8-a2ea-4073-a6ca-c391aaf296d0" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.91:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.803747 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25"
Mar 14 06:52:18 crc kubenswrapper[4713]: I0314 06:52:18.810786 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerName="galera" probeResult="failure" output="command timed out"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.295686 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" event={"ID":"b00783a2-42c7-45b5-b83d-136c314b0086","Type":"ContainerStarted","Data":"8d24f0f9595dfed762c191798926dbee0c9240ca200a8a8297217edf12872de1"}
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.296148 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.296202 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body=
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.296255 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.299872 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6" event={"ID":"4353213b-b89f-4288-babb-7afef0ca216a","Type":"ContainerStarted","Data":"4068fa2e0cdcdcc4b6536fbc327178178e3bbcdd0ffa78a057a106cbf9c7c49d"}
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.299919 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.305138 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.305183 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.307426 4713 generic.go:334] "Generic (PLEG): container finished" podID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerID="ae8f3f497a7e1f2784f8929d857ad944545fb9d4a5baaa285c02257b2f6619ad" exitCode=143
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.307474 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerDied","Data":"ae8f3f497a7e1f2784f8929d857ad944545fb9d4a5baaa285c02257b2f6619ad"}
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.307502 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerStarted","Data":"a30db1efcde21610805d69143819d49bcc2c6fe2cf584a7cc666ffe97ac92b26"}
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.556911 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-25r25"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.696286 4713 patch_prober.go:28] interesting pod/metrics-server-6d5f446985-q8pw2 container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.696751 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" podUID="d7cea157-995e-400c-b2ee-85357ae7fb7b" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.696424 4713 patch_prober.go:28] interesting pod/metrics-server-6d5f446985-q8pw2 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.697008 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" podUID="d7cea157-995e-400c-b2ee-85357ae7fb7b" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.697059 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.808451 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b" containerName="galera" probeResult="failure" output="command timed out"
Mar 14 06:52:19 crc kubenswrapper[4713]: I0314 06:52:19.808463 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b" containerName="galera" probeResult="failure" output="command timed out"
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.050026 4713 patch_prober.go:28] interesting pod/monitoring-plugin-84469c67d6-74jtt container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.050151 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" podUID="88a15bde-288a-4e1f-b537-7127832ecb65" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.064678 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt"
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.270269 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-44kkb" podUID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:20 crc kubenswrapper[4713]: timeout: health rpc did not complete within 1s
Mar 14 06:52:20 crc kubenswrapper[4713]: >
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.320723 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw" event={"ID":"8440ca7f-e5cd-4deb-9e52-8be733b65583","Type":"ContainerStarted","Data":"4b63a1d6dacefdb78013f17fc981fa493efe098b97a0ad55d6f65e52491c0cac"}
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.321370 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.323083 4713 generic.go:334] "Generic (PLEG): container finished" podID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerID="d75b4b698387d0513da2b6b82c82fe51ae18108798499ede476fbf54d04cd679" exitCode=0
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.323169 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" event={"ID":"b3d449c0-bf37-40e8-9e4c-14f586d1f0b3","Type":"ContainerDied","Data":"d75b4b698387d0513da2b6b82c82fe51ae18108798499ede476fbf54d04cd679"}
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.328546 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-64h8p" event={"ID":"fb9f27cd-ac40-407e-b9a5-f9594122604f","Type":"ContainerStarted","Data":"113f316b327687cd82050614a55249e3843b56d4296128a52dd9016f371cdc6e"}
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.329111 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body=
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.329151 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused"
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.329360 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"6a51f0f57d059e36da820b4d22f05bd917ee3f70f54b4ba1f35b3ebf56efeb80"} pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" containerMessage="Container metrics-server failed liveness probe, will be restarted"
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.329404 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" podUID="d7cea157-995e-400c-b2ee-85357ae7fb7b" containerName="metrics-server" containerID="cri-o://6a51f0f57d059e36da820b4d22f05bd917ee3f70f54b4ba1f35b3ebf56efeb80" gracePeriod=170
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.329713 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-64h8p" podUID="fb9f27cd-ac40-407e-b9a5-f9594122604f" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": dial tcp 127.0.0.1:7572: connect: connection refused"
Mar 14 06:52:20 crc kubenswrapper[4713]: I0314 06:52:20.808528 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="249e8a7d-5c1c-4d41-a243-6ab0ad96094c" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.066331 4713 patch_prober.go:28] interesting pod/monitoring-plugin-84469c67d6-74jtt container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.066392 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt" podUID="88a15bde-288a-4e1f-b537-7127832ecb65" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.082548 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.284855 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.284916 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.284969 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-cdq27"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.285366 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.285400 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.285459 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-cdq27"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.286539 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"a9aac735cfd72f38657e56780f2154074363efe66bccfa6c94322f23718cfc69"} pod="openshift-console-operator/console-operator-58897d9998-cdq27" containerMessage="Container console-operator failed liveness probe, will be restarted"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.286592 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" containerID="cri-o://a9aac735cfd72f38657e56780f2154074363efe66bccfa6c94322f23718cfc69" gracePeriod=30
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.314162 4713 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.314265 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.355182 4713 generic.go:334] "Generic (PLEG): container finished" podID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerID="1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766" exitCode=0
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.355274 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-44kkb" event={"ID":"5a437700-77f6-4838-9a7d-89eda8a27afa","Type":"ContainerDied","Data":"1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766"}
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.371580 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" event={"ID":"b3d449c0-bf37-40e8-9e4c-14f586d1f0b3","Type":"ContainerStarted","Data":"4fb320a383125561c19b6186156f12770e33458a0aff9a2a9af414e396b956af"}
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.372443 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" start-of-body=
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.372530 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.372718 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-64h8p"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.581392 4713 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-h7xw6 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.581741 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" podUID="82a7870b-ac91-41f9-a94f-41db191e711b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.583627 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.586831 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body=
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.586983 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.586865 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wp5sf container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body=
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.587045 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf" podUID="b00783a2-42c7-45b5-b83d-136c314b0086" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.783814 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" podUID="4128f2c6-d929-4815-8502-291baf22f24f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.784270 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j4474" podUID="fa62dff3-1643-4e94-b31a-d56b21a2327d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.784374 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" podUID="12eb62d0-8721-4482-b4a3-148a61cea029" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.784746 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6j4tq" podUID="4128f2c6-d929-4815-8502-291baf22f24f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:21 crc kubenswrapper[4713]: I0314 06:52:21.785351 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-d47688694-kq9dl" podUID="12eb62d0-8721-4482-b4a3-148a61cea029" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295449 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" podUID="46274028-feea-4c48-b086-44533fc3e996" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295452 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" podUID="9da309ad-34cc-4b06-b166-c571b5a39825" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295613 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs" podUID="9da309ad-34cc-4b06-b166-c571b5a39825" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295640 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-shvlw" podUID="46274028-feea-4c48-b086-44533fc3e996" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295689 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295824 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295847 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295882 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295909 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.295992 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.296025 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.297626 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="olm-operator" containerStatusID={"Type":"cri-o","ID":"baa3682466c9ae2c7012ba3eaf20f8ed949fab126a3415fca63fafa3b7102d68"} pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" containerMessage="Container olm-operator failed liveness probe, will be restarted"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.297679 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" containerID="cri-o://baa3682466c9ae2c7012ba3eaf20f8ed949fab126a3415fca63fafa3b7102d68" gracePeriod=30
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.378386 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" podUID="409a2a8b-7e66-4763-9698-3a909f051c50" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.385316 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.460376 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-7f84474648-8mm88" podUID="409a2a8b-7e66-4763-9698-3a909f051c50" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.460404 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" podUID="50d43641-0638-4763-9123-0c0c2c76629e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.460505 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.460525 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.460580 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" podUID="50d43641-0638-4763-9123-0c0c2c76629e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.460645 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.460965 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.461033 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.477760 4713 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-rbp8f container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.477838 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-rbp8f" podUID="ed72a9eb-a4ee-430c-9449-566f2c56c3bf" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: E0314 06:52:22.487787 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766 is running failed: container process not found" containerID="1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766" cmd=["grpc_health_probe","-addr=:50051"]
Mar 14 06:52:22 crc kubenswrapper[4713]: E0314 06:52:22.488812 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766 is running failed: container process not found" containerID="1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766" cmd=["grpc_health_probe","-addr=:50051"]
Mar 14 06:52:22 crc kubenswrapper[4713]: E0314 06:52:22.489300 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766 is running failed: container process not found" containerID="1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766" cmd=["grpc_health_probe","-addr=:50051"]
Mar 14 06:52:22 crc kubenswrapper[4713]: E0314 06:52:22.489356 4713 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1199d135f5157aa1dbcf30f7b4669efffd3003995d5af1cc02fd9f845ca9f766 is running failed: container process not found" probeType="Readiness" pod="openstack-operators/openstack-operator-index-44kkb" podUID="5a437700-77f6-4838-9a7d-89eda8a27afa" containerName="registry-server"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.556188 4713 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-5d4kx container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.556299 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5d4kx" podUID="1d1f7414-07c8-48ab-bc8b-3892473aa10f" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.585079 4713 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-h7xw6 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.585139 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6" podUID="82a7870b-ac91-41f9-a94f-41db191e711b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.708279 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-mwlnd container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.708347 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-mwlnd" podUID="8eed3eb1-25e3-4d02-b5fd-d8f691af6c21" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.715796 4713 patch_prober.go:28] interesting pod/logging-loki-gateway-54c568c9c8-98zs2 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.715857 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-54c568c9c8-98zs2" podUID="d77ba467-d131-42b6-9297-e30cbb7d9c57" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.750408 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" podUID="a55d0754-702d-4dbc-995a-b98d852678ce" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.750442 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" podUID="aa4ff369-f2af-439f-b9f6-2c8301e80210" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.750502 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" podUID="aa4ff369-f2af-439f-b9f6-2c8301e80210" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.750579 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.750683 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-6xshs"
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.836344 4713 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-w65zj container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.836392 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-kv59d" podUID="a55d0754-702d-4dbc-995a-b98d852678ce" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 
06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.836410 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" podUID="0dbd3bea-9644-4bc5-96c7-822b26810706" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.836659 4713 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-w65zj container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.836709 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-w65zj" podUID="0dbd3bea-9644-4bc5-96c7-822b26810706" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.837440 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-hqw7p" Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.979441 4713 patch_prober.go:28] interesting pod/console-79df5895fd-4nxm5 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.143:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.979448 4713 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" podUID="383e8493-0661-4b45-a72c-5851b520c65b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.979557 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-2r2fz" podUID="383e8493-0661-4b45-a72c-5851b520c65b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.979506 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-79df5895fd-4nxm5" podUID="635fd3d7-4984-4dae-9416-068a4d020d75" containerName="console" probeResult="failure" output="Get \"https://10.217.0.143:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:22 crc kubenswrapper[4713]: I0314 06:52:22.979684 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-nfgb5" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.180638 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.181337 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h" podUID="fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.204957 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.254508 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" podUID="3941b4bd-470d-4351-aed9-4bc1f90f9ad4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.295496 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-pqnf9" podUID="3941b4bd-470d-4351-aed9-4bc1f90f9ad4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.296580 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.296670 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.320326 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-64h8p" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.418119 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-cdq27_8fb6723a-f90c-46d8-a294-d9f916179353/console-operator/0.log" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.418178 4713 generic.go:334] "Generic (PLEG): container finished" podID="8fb6723a-f90c-46d8-a294-d9f916179353" containerID="a9aac735cfd72f38657e56780f2154074363efe66bccfa6c94322f23718cfc69" exitCode=1 Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.418258 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cdq27" event={"ID":"8fb6723a-f90c-46d8-a294-d9f916179353","Type":"ContainerDied","Data":"a9aac735cfd72f38657e56780f2154074363efe66bccfa6c94322f23718cfc69"} Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.429832 4713 generic.go:334] "Generic (PLEG): container finished" podID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerID="01f3d0fa33a108eb57c269fe7ac1b047e66c75ed71d13bcce3f53e8a0e985294" exitCode=0 Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.429896 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" event={"ID":"540bef84-9032-46b1-951a-9270e9cbbc9a","Type":"ContainerDied","Data":"01f3d0fa33a108eb57c269fe7ac1b047e66c75ed71d13bcce3f53e8a0e985294"} Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.433435 4713 generic.go:334] "Generic (PLEG): container finished" podID="c2c79b18-2189-46d9-bbd4-55f58870d723" containerID="bd0395a1990f66cb6d99e088c548f943054f350eec3636be7b07d8ea6d5e8ac0" exitCode=0 Mar 14 06:52:23 crc 
kubenswrapper[4713]: I0314 06:52:23.433481 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2c79b18-2189-46d9-bbd4-55f58870d723","Type":"ContainerDied","Data":"bd0395a1990f66cb6d99e088c548f943054f350eec3636be7b07d8ea6d5e8ac0"} Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.434162 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" start-of-body= Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.434264 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.682245 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557852-7mmfv"] Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.688013 4713 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-wnccn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.7:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.688102 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" podUID="94841b22-b2eb-4519-b04f-98010d848b46" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.7:8081/healthz\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.729886 4713 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-wnccn container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.7:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.730016 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-wnccn" podUID="94841b22-b2eb-4519-b04f-98010d848b46" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.7:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.730125 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-fm6fr" podUID="cbc588fa-b052-4336-81fe-2fed809e251b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:23 crc kubenswrapper[4713]: I0314 06:52:23.747789 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xmbz2" Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.062399 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.156978 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-64h8p" Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.449561 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-cdq27_8fb6723a-f90c-46d8-a294-d9f916179353/console-operator/0.log" Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.449896 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cdq27" event={"ID":"8fb6723a-f90c-46d8-a294-d9f916179353","Type":"ContainerStarted","Data":"83397d30fd2e20e25dc129291dee560794f991d1270407ec4fca549c35412d7b"} Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.450154 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-cdq27" Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.450657 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.450692 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.453851 4713 generic.go:334] "Generic (PLEG): container finished" podID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerID="baa3682466c9ae2c7012ba3eaf20f8ed949fab126a3415fca63fafa3b7102d68" exitCode=0 Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.453900 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" 
event={"ID":"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec","Type":"ContainerDied","Data":"baa3682466c9ae2c7012ba3eaf20f8ed949fab126a3415fca63fafa3b7102d68"} Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.456183 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" event={"ID":"540bef84-9032-46b1-951a-9270e9cbbc9a","Type":"ContainerStarted","Data":"5839741c10a3f3355bb9648a36eead3e2dcf33e6e7e66e637855413580b49899"} Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.456449 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.456760 4713 patch_prober.go:28] interesting pod/route-controller-manager-6c9756c5df-htcbt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.456814 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.458652 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-44kkb" event={"ID":"5a437700-77f6-4838-9a7d-89eda8a27afa","Type":"ContainerStarted","Data":"1ae8a26ef351405c1713a959ce17c2143509b65ef50def1bb1400b642f48de52"} Mar 14 06:52:24 crc kubenswrapper[4713]: I0314 06:52:24.942701 4713 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/openstack-operator-controller-init-64f68cccc7-5r6v2" podUID="129ebe3f-95aa-42f1-8f56-1d3120fb5419" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.305130 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.305508 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.471796 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" event={"ID":"f31c44db-4635-4d5d-8aa5-98be5a6fd0ec","Type":"ContainerStarted","Data":"7ebdf41ca9e49159f1c66580aff79e43ed16500ea22850b32da9450fb9fd738a"} Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.473546 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.473638 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 14 06:52:25 crc 
kubenswrapper[4713]: I0314 06:52:25.473678 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.473752 4713 patch_prober.go:28] interesting pod/route-controller-manager-6c9756c5df-htcbt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.473777 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.473892 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.473931 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.559775 4713 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-prgds" podUID="58f78f5a-d3da-4bf6-bf82-c98dbbe9602f" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:25 crc kubenswrapper[4713]: > Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.561072 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-vmvfw" podUID="e412202e-9dd7-4ebb-90a4-c25cbf3241b8" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:25 crc kubenswrapper[4713]: > Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.561590 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-vmvfw" podUID="e412202e-9dd7-4ebb-90a4-c25cbf3241b8" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:25 crc kubenswrapper[4713]: > Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.561598 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:25 crc kubenswrapper[4713]: > Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.561738 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-525cq" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.562744 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-qhnqn" 
podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:25 crc kubenswrapper[4713]: > Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.566192 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:25 crc kubenswrapper[4713]: > Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.582519 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:25 crc kubenswrapper[4713]: > Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.582597 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-prgds" podUID="58f78f5a-d3da-4bf6-bf82-c98dbbe9602f" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:25 crc kubenswrapper[4713]: > Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.594567 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.594622 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.594635 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-operators-525cq" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.596637 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"9bb08d20a1e1663cd264d06a36c564956bee78bc166d1ff6760d72e4a9475ecb"} pod="openshift-marketplace/redhat-marketplace-qhnqn" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.596684 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" containerID="cri-o://9bb08d20a1e1663cd264d06a36c564956bee78bc166d1ff6760d72e4a9475ecb" gracePeriod=30 Mar 14 06:52:25 crc kubenswrapper[4713]: E0314 06:52:25.613560 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9bb08d20a1e1663cd264d06a36c564956bee78bc166d1ff6760d72e4a9475ecb" cmd=["grpc_health_probe","-addr=:50051"] Mar 14 06:52:25 crc kubenswrapper[4713]: E0314 06:52:25.619500 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9bb08d20a1e1663cd264d06a36c564956bee78bc166d1ff6760d72e4a9475ecb" cmd=["grpc_health_probe","-addr=:50051"] Mar 14 06:52:25 crc kubenswrapper[4713]: E0314 06:52:25.656939 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9bb08d20a1e1663cd264d06a36c564956bee78bc166d1ff6760d72e4a9475ecb" 
cmd=["grpc_health_probe","-addr=:50051"]
Mar 14 06:52:25 crc kubenswrapper[4713]: E0314 06:52:25.657020 4713 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server"
Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.992581 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-cl7ll" podUID="a5cbbe27-0738-4819-a4bc-5bc7d2945248" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:25 crc kubenswrapper[4713]: I0314 06:52:25.993192 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-cl7ll" podUID="a5cbbe27-0738-4819-a4bc-5bc7d2945248" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.083545 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" start-of-body=
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.083594 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused"
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.083989 4713 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-dhljx container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused" start-of-body=
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.084011 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx" podUID="b3d449c0-bf37-40e8-9e4c-14f586d1f0b3" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": dial tcp 10.217.0.78:8443: connect: connection refused"
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.353721 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-525cq"
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.494618 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2c79b18-2189-46d9-bbd4-55f58870d723","Type":"ContainerStarted","Data":"b6fd864adbb2a537f115734866000ce3f2d79b81f0a012355c4f2eebf59034b2"}
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.500646 4713 generic.go:334] "Generic (PLEG): container finished" podID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerID="450c6e52ec6a5edfbb0bb0de38472b0a24f097ed56db00c6e347debcd90953b9" exitCode=0
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.500745 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" event={"ID":"da4bb664-a24b-4644-9b7a-a0c6eda2c66f","Type":"ContainerDied","Data":"450c6e52ec6a5edfbb0bb0de38472b0a24f097ed56db00c6e347debcd90953b9"}
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.501641 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"59eb6b59ba6e54cb3e5c4aadf922604d1c2324e1b1542e24eab771fe130db99c"} pod="openshift-marketplace/redhat-operators-525cq" containerMessage="Container registry-server failed liveness probe, will be restarted"
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.501683 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" containerID="cri-o://59eb6b59ba6e54cb3e5c4aadf922604d1c2324e1b1542e24eab771fe130db99c" gracePeriod=30
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.502536 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body=
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.502582 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused"
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.601582 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerName="galera" containerID="cri-o://14748d02753ab502f9ae5a5c30c424c7a250193acb463caf9edf740cde85c571" gracePeriod=22
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.764479 4713 patch_prober.go:28] interesting pod/route-controller-manager-6c9756c5df-htcbt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body=
Mar 14 06:52:26 crc kubenswrapper[4713]: I0314 06:52:26.764537 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt" podUID="540bef84-9032-46b1-951a-9270e9cbbc9a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused"
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.278965 4713 trace.go:236] Trace[2045885167]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (14-Mar-2026 06:52:23.738) (total time: 3536ms):
Mar 14 06:52:27 crc kubenswrapper[4713]: Trace[2045885167]: [3.536767666s] [3.536767666s] END
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.517237 4713 generic.go:334] "Generic (PLEG): container finished" podID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerID="9bb08d20a1e1663cd264d06a36c564956bee78bc166d1ff6760d72e4a9475ecb" exitCode=0
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.517306 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhnqn" event={"ID":"fc4d2c5d-cf64-489f-9229-3e79a6e369c3","Type":"ContainerDied","Data":"9bb08d20a1e1663cd264d06a36c564956bee78bc166d1ff6760d72e4a9475ecb"}
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.522298 4713 generic.go:334] "Generic (PLEG): container finished" podID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerID="59eb6b59ba6e54cb3e5c4aadf922604d1c2324e1b1542e24eab771fe130db99c" exitCode=0
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.522367 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525cq" event={"ID":"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd","Type":"ContainerDied","Data":"59eb6b59ba6e54cb3e5c4aadf922604d1c2324e1b1542e24eab771fe130db99c"}
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.523638 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rlhc9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body=
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.523694 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9" podUID="f31c44db-4635-4d5d-8aa5-98be5a6fd0ec" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused"
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.809707 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerName="galera" probeResult="failure" output="command timed out"
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.875087 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557852-7mmfv"]
Mar 14 06:52:27 crc kubenswrapper[4713]: I0314 06:52:27.905670 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9tzcq"]
Mar 14 06:52:28 crc kubenswrapper[4713]: W0314 06:52:28.146319 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97afee50_131f_4e19_a172_0020e3607abc.slice/crio-c0f195440b602f93a7a4f739d9cdea3488b1d392b19cd428d5aa23fcf5d4c5b0 WatchSource:0}: Error finding container c0f195440b602f93a7a4f739d9cdea3488b1d392b19cd428d5aa23fcf5d4c5b0: Status 404 returned error can't find the container with id c0f195440b602f93a7a4f739d9cdea3488b1d392b19cd428d5aa23fcf5d4c5b0
Mar 14 06:52:28 crc kubenswrapper[4713]: I0314 06:52:28.304779 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 14 06:52:28 crc kubenswrapper[4713]: I0314 06:52:28.304831 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 14 06:52:28 crc kubenswrapper[4713]: I0314 06:52:28.537618 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qhnqn" event={"ID":"fc4d2c5d-cf64-489f-9229-3e79a6e369c3","Type":"ContainerStarted","Data":"26c28a9c0bdac6bbb61e569191ec7d53c8f654577a8ca08871b66ad86c644adf"}
Mar 14 06:52:28 crc kubenswrapper[4713]: I0314 06:52:28.541530 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" event={"ID":"da4bb664-a24b-4644-9b7a-a0c6eda2c66f","Type":"ContainerStarted","Data":"d90d61e7ef23f805a3a63df363d4753c49f1e3aab0a4f603d2c651437e76a67f"}
Mar 14 06:52:28 crc kubenswrapper[4713]: I0314 06:52:28.541124 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 14 06:52:28 crc kubenswrapper[4713]: I0314 06:52:28.541619 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 14 06:52:28 crc kubenswrapper[4713]: I0314 06:52:28.542519 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tzcq" event={"ID":"8026f430-07d6-4f1a-98ac-b39a9ad6130d","Type":"ContainerStarted","Data":"c51fa660c0a56582884dd4ae4bcc50e5b9366ea2a84ec610ff3c60eb7544fcb5"}
Mar 14 06:52:28 crc kubenswrapper[4713]: I0314 06:52:28.544384 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557852-7mmfv" event={"ID":"97afee50-131f-4e19-a172-0020e3607abc","Type":"ContainerStarted","Data":"c0f195440b602f93a7a4f739d9cdea3488b1d392b19cd428d5aa23fcf5d4c5b0"}
Mar 14 06:52:28 crc kubenswrapper[4713]: I0314 06:52:28.551841 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525cq" event={"ID":"d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd","Type":"ContainerStarted","Data":"980a529c11d23581c02106f6d9d46987d4905375b1aec1d564bf020c5cc8fed9"}
Mar 14 06:52:29 crc kubenswrapper[4713]: I0314 06:52:29.565631 4713 generic.go:334] "Generic (PLEG): container finished" podID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerID="d8f57c45086806229d2c6327568f9c70f443a719549e79eccc20bebc0d707fb4" exitCode=0
Mar 14 06:52:29 crc kubenswrapper[4713]: I0314 06:52:29.590024 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb"
Mar 14 06:52:29 crc kubenswrapper[4713]: I0314 06:52:29.590085 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tzcq" event={"ID":"8026f430-07d6-4f1a-98ac-b39a9ad6130d","Type":"ContainerDied","Data":"d8f57c45086806229d2c6327568f9c70f443a719549e79eccc20bebc0d707fb4"}
Mar 14 06:52:29 crc kubenswrapper[4713]: I0314 06:52:29.646871 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="f6b61a43-5015-4b52-b55f-4ea941db9a0d" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 14 06:52:29 crc kubenswrapper[4713]: I0314 06:52:29.766568 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-84469c67d6-74jtt"
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.285327 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.285579 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.285344 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdq27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.285846 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cdq27" podUID="8fb6723a-f90c-46d8-a294-d9f916179353" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.595008 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.595264 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.644664 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qhnqn"
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.644714 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qhnqn"
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.804965 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-525cq"
Mar 14 06:52:30 crc kubenswrapper[4713]: I0314 06:52:30.805338 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-525cq"
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.078138 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-h7xw6"
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.277328 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlhc9"
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.305268 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.305312 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.305543 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-748rb container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.305557 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb" podUID="da4bb664-a24b-4644-9b7a-a0c6eda2c66f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.608190 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tzcq" event={"ID":"8026f430-07d6-4f1a-98ac-b39a9ad6130d","Type":"ContainerStarted","Data":"0ab3e67b0269d167a6e7c98bbe5ea2e66a708306c837341e10b8ca0f9b27e966"}
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.610278 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557852-7mmfv" event={"ID":"97afee50-131f-4e19-a172-0020e3607abc","Type":"ContainerStarted","Data":"8beb7e79ad21b276b2dd0876c9ff84721fe2dc1996babc27eef7bc0dd5ed1f32"}
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.613978 4713 generic.go:334] "Generic (PLEG): container finished" podID="051e4d6d-86dc-479f-a659-6f95b7baa817" containerID="14748d02753ab502f9ae5a5c30c424c7a250193acb463caf9edf740cde85c571" exitCode=0
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.614043 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"051e4d6d-86dc-479f-a659-6f95b7baa817","Type":"ContainerDied","Data":"14748d02753ab502f9ae5a5c30c424c7a250193acb463caf9edf740cde85c571"}
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.614092 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"051e4d6d-86dc-479f-a659-6f95b7baa817","Type":"ContainerStarted","Data":"c15166c27383c67b9aacfb94fc23be12523ce5677b61c30888a669baa28b63aa"}
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.715559 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wp5sf"
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.725241 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557852-7mmfv" podStartSLOduration=27.69405641 podStartE2EDuration="28.725145498s" podCreationTimestamp="2026-03-14 06:52:03 +0000 UTC" firstStartedPulling="2026-03-14 06:52:28.160877638 +0000 UTC m=+5131.248786938" lastFinishedPulling="2026-03-14 06:52:29.191966726 +0000 UTC m=+5132.279876026" observedRunningTime="2026-03-14 06:52:31.69182157 +0000 UTC m=+5134.779730870" watchObservedRunningTime="2026-03-14 06:52:31.725145498 +0000 UTC m=+5134.813054798"
Mar 14 06:52:31 crc kubenswrapper[4713]: I0314 06:52:31.966561 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79df5895fd-4nxm5"
Mar 14 06:52:32 crc kubenswrapper[4713]: I0314 06:52:32.090486 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="f6b61a43-5015-4b52-b55f-4ea941db9a0d" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 14 06:52:32 crc kubenswrapper[4713]: I0314 06:52:32.174850 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6lw5h"
Mar 14 06:52:32 crc kubenswrapper[4713]: I0314 06:52:32.478720 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:32 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:32 crc kubenswrapper[4713]: >
Mar 14 06:52:32 crc kubenswrapper[4713]: I0314 06:52:32.479467 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:32 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:32 crc kubenswrapper[4713]: >
Mar 14 06:52:32 crc kubenswrapper[4713]: I0314 06:52:32.480969 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 06:52:32 crc kubenswrapper[4713]: I0314 06:52:32.481339 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 06:52:32 crc kubenswrapper[4713]: I0314 06:52:32.651456 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b665cf668-zl2jw"
Mar 14 06:52:33 crc kubenswrapper[4713]: I0314 06:52:33.045550 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 06:52:33 crc kubenswrapper[4713]: I0314 06:52:33.327358 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-64h8p"
Mar 14 06:52:33 crc kubenswrapper[4713]: I0314 06:52:33.327979 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-84hc6"
Mar 14 06:52:33 crc kubenswrapper[4713]: I0314 06:52:33.704444 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-44kkb"
Mar 14 06:52:34 crc kubenswrapper[4713]: I0314 06:52:34.309497 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-748rb"
Mar 14 06:52:35 crc kubenswrapper[4713]: I0314 06:52:35.074835 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="f6b61a43-5015-4b52-b55f-4ea941db9a0d" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 14 06:52:35 crc kubenswrapper[4713]: I0314 06:52:35.074917 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 14 06:52:35 crc kubenswrapper[4713]: I0314 06:52:35.103968 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"549819fc0aca7b042e506cb219f30aa31ecc5117c5843e20676e34ee6700e42e"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted"
Mar 14 06:52:35 crc kubenswrapper[4713]: I0314 06:52:35.104632 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f6b61a43-5015-4b52-b55f-4ea941db9a0d" containerName="cinder-scheduler" containerID="cri-o://549819fc0aca7b042e506cb219f30aa31ecc5117c5843e20676e34ee6700e42e" gracePeriod=30
Mar 14 06:52:35 crc kubenswrapper[4713]: I0314 06:52:35.659264 4713 generic.go:334] "Generic (PLEG): container finished" podID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerID="1c36a07fee5bbbd358c73c10d9c2f80f6405cf7c8a3a9aca6bc96f355d1febd7" exitCode=0
Mar 14 06:52:35 crc kubenswrapper[4713]: I0314 06:52:35.659351 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncc7b" event={"ID":"066d38a5-380e-465e-912b-2fe268a4b4c4","Type":"ContainerDied","Data":"1c36a07fee5bbbd358c73c10d9c2f80f6405cf7c8a3a9aca6bc96f355d1febd7"}
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.090491 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dhljx"
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.539096 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.539480 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.701170 4713 generic.go:334] "Generic (PLEG): container finished" podID="f6b61a43-5015-4b52-b55f-4ea941db9a0d" containerID="549819fc0aca7b042e506cb219f30aa31ecc5117c5843e20676e34ee6700e42e" exitCode=0
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.701533 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b61a43-5015-4b52-b55f-4ea941db9a0d","Type":"ContainerDied","Data":"549819fc0aca7b042e506cb219f30aa31ecc5117c5843e20676e34ee6700e42e"}
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.707175 4713 generic.go:334] "Generic (PLEG): container finished" podID="97afee50-131f-4e19-a172-0020e3607abc" containerID="8beb7e79ad21b276b2dd0876c9ff84721fe2dc1996babc27eef7bc0dd5ed1f32" exitCode=0
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.707276 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557852-7mmfv" event={"ID":"97afee50-131f-4e19-a172-0020e3607abc","Type":"ContainerDied","Data":"8beb7e79ad21b276b2dd0876c9ff84721fe2dc1996babc27eef7bc0dd5ed1f32"}
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.729398 4713 generic.go:334] "Generic (PLEG): container finished" podID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerID="0ab3e67b0269d167a6e7c98bbe5ea2e66a708306c837341e10b8ca0f9b27e966" exitCode=0
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.730166 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tzcq" event={"ID":"8026f430-07d6-4f1a-98ac-b39a9ad6130d","Type":"ContainerDied","Data":"0ab3e67b0269d167a6e7c98bbe5ea2e66a708306c837341e10b8ca0f9b27e966"}
Mar 14 06:52:36 crc kubenswrapper[4713]: I0314 06:52:36.776938 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c9756c5df-htcbt"
Mar 14 06:52:37 crc kubenswrapper[4713]: I0314 06:52:37.629694 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 14 06:52:37 crc kubenswrapper[4713]: I0314 06:52:37.750835 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncc7b" event={"ID":"066d38a5-380e-465e-912b-2fe268a4b4c4","Type":"ContainerStarted","Data":"03a7f1f220cb3842edd529d73f440b17fbdd54d75f7db0cbc8864168188ec95d"}
Mar 14 06:52:37 crc kubenswrapper[4713]: I0314 06:52:37.755658 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tzcq" event={"ID":"8026f430-07d6-4f1a-98ac-b39a9ad6130d","Type":"ContainerStarted","Data":"f62d443f825315fd7daad91741162a0adf6f0cdeb7f6383ee31470f443bb7a60"}
Mar 14 06:52:37 crc kubenswrapper[4713]: I0314 06:52:37.782645 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ncc7b" podStartSLOduration=6.337900684 podStartE2EDuration="51.782618393s" podCreationTimestamp="2026-03-14 06:51:46 +0000 UTC" firstStartedPulling="2026-03-14 06:51:50.733925894 +0000 UTC m=+5093.821835194" lastFinishedPulling="2026-03-14 06:52:36.178643603 +0000 UTC m=+5139.266552903" observedRunningTime="2026-03-14 06:52:37.779672321 +0000 UTC m=+5140.867581631" watchObservedRunningTime="2026-03-14 06:52:37.782618393 +0000 UTC m=+5140.870527693"
Mar 14 06:52:37 crc kubenswrapper[4713]: I0314 06:52:37.856470 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9tzcq" podStartSLOduration=36.285063014 podStartE2EDuration="43.856443746s" podCreationTimestamp="2026-03-14 06:51:54 +0000 UTC" firstStartedPulling="2026-03-14 06:52:29.585901039 +0000 UTC m=+5132.673810339" lastFinishedPulling="2026-03-14 06:52:37.157281771 +0000 UTC m=+5140.245191071" observedRunningTime="2026-03-14 06:52:37.806747593 +0000 UTC m=+5140.894656893" watchObservedRunningTime="2026-03-14 06:52:37.856443746 +0000 UTC m=+5140.944353066"
Mar 14 06:52:37 crc kubenswrapper[4713]: I0314 06:52:37.864643 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9tzcq"
Mar 14 06:52:37 crc kubenswrapper[4713]: I0314 06:52:37.864693 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9tzcq"
Mar 14 06:52:37 crc kubenswrapper[4713]: I0314 06:52:37.926042 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 14 06:52:39 crc kubenswrapper[4713]: I0314 06:52:39.482128 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ncc7b"
Mar 14 06:52:39 crc kubenswrapper[4713]: I0314 06:52:39.482806 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ncc7b"
Mar 14 06:52:39 crc kubenswrapper[4713]: I0314 06:52:39.487332 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9tzcq" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:39 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:39 crc kubenswrapper[4713]: >
Mar 14 06:52:40 crc kubenswrapper[4713]: I0314 06:52:40.171893 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" containerID="cri-o://ef5af81019e2cc410dddb8817db8a366260075d08aa35fcae4eb5912ea7b56a4" gracePeriod=15
Mar 14 06:52:40 crc kubenswrapper[4713]: I0314 06:52:40.299399 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-cdq27"
Mar 14 06:52:40 crc kubenswrapper[4713]: I0314 06:52:40.578904 4713 generic.go:334] "Generic (PLEG): container finished" podID="69f142af-62c3-4d29-8870-be92b4c7216d" containerID="ef5af81019e2cc410dddb8817db8a366260075d08aa35fcae4eb5912ea7b56a4" exitCode=0
Mar 14 06:52:40 crc kubenswrapper[4713]: I0314 06:52:40.580667 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" event={"ID":"69f142af-62c3-4d29-8870-be92b4c7216d","Type":"ContainerDied","Data":"ef5af81019e2cc410dddb8817db8a366260075d08aa35fcae4eb5912ea7b56a4"}
Mar 14 06:52:40 crc kubenswrapper[4713]: I0314 06:52:40.594307 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ncc7b" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:40 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:40 crc kubenswrapper[4713]: >
Mar 14 06:52:40 crc kubenswrapper[4713]: I0314 06:52:40.863243 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557852-7mmfv"
Mar 14 06:52:40 crc kubenswrapper[4713]: I0314 06:52:40.941639 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29hqg\" (UniqueName: \"kubernetes.io/projected/97afee50-131f-4e19-a172-0020e3607abc-kube-api-access-29hqg\") pod \"97afee50-131f-4e19-a172-0020e3607abc\" (UID: \"97afee50-131f-4e19-a172-0020e3607abc\") "
Mar 14 06:52:41 crc kubenswrapper[4713]: I0314 06:52:40.982634 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97afee50-131f-4e19-a172-0020e3607abc-kube-api-access-29hqg" (OuterVolumeSpecName: "kube-api-access-29hqg") pod "97afee50-131f-4e19-a172-0020e3607abc" (UID: "97afee50-131f-4e19-a172-0020e3607abc"). InnerVolumeSpecName "kube-api-access-29hqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:52:41 crc kubenswrapper[4713]: I0314 06:52:41.052295 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29hqg\" (UniqueName: \"kubernetes.io/projected/97afee50-131f-4e19-a172-0020e3607abc-kube-api-access-29hqg\") on node \"crc\" DevicePath \"\""
Mar 14 06:52:41 crc kubenswrapper[4713]: I0314 06:52:41.619874 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557852-7mmfv" event={"ID":"97afee50-131f-4e19-a172-0020e3607abc","Type":"ContainerDied","Data":"c0f195440b602f93a7a4f739d9cdea3488b1d392b19cd428d5aa23fcf5d4c5b0"}
Mar 14 06:52:41 crc kubenswrapper[4713]: I0314 06:52:41.620109 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557852-7mmfv"
Mar 14 06:52:41 crc kubenswrapper[4713]: I0314 06:52:41.627480 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f195440b602f93a7a4f739d9cdea3488b1d392b19cd428d5aa23fcf5d4c5b0"
Mar 14 06:52:41 crc kubenswrapper[4713]: I0314 06:52:41.627555 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b61a43-5015-4b52-b55f-4ea941db9a0d","Type":"ContainerStarted","Data":"e21b4ba2e00a8176e7673d0dda470d36bd6639f6f5ed981d588da4fb08677f6c"}
Mar 14 06:52:41 crc kubenswrapper[4713]: I0314 06:52:41.720573 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:41 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:41 crc kubenswrapper[4713]: >
Mar 14 06:52:41 crc kubenswrapper[4713]: I0314 06:52:41.872241 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:41 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:41 crc kubenswrapper[4713]: >
Mar 14 06:52:42 crc kubenswrapper[4713]: I0314 06:52:42.034030 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557846-jbmqw"]
Mar 14 06:52:42 crc kubenswrapper[4713]: I0314 06:52:42.037319 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 14 06:52:42 crc kubenswrapper[4713]: I0314 06:52:42.051686 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557846-jbmqw"]
Mar 14 06:52:42 crc kubenswrapper[4713]: I0314 06:52:42.659451 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" event={"ID":"69f142af-62c3-4d29-8870-be92b4c7216d","Type":"ContainerStarted","Data":"b630bf40cd53aae26bd52964f6beeb0979993d66edbb7b1b715dad4ab987b6d3"}
Mar 14 06:52:42 crc kubenswrapper[4713]: I0314 06:52:42.659793 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 06:52:42 crc kubenswrapper[4713]: I0314 06:52:42.662466 4713 patch_prober.go:28] interesting pod/oauth-openshift-7dc7444945-bll47 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": dial tcp 10.217.0.69:6443: connect: connection refused" start-of-body=
Mar 14 06:52:42 crc kubenswrapper[4713]: I0314 06:52:42.662980 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47" podUID="69f142af-62c3-4d29-8870-be92b4c7216d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": dial tcp 10.217.0.69:6443: connect: connection refused"
Mar 14 06:52:43 crc kubenswrapper[4713]: I0314 06:52:43.577401 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6abcaea2-39c2-4f9f-85fe-b51d5e791e17" path="/var/lib/kubelet/pods/6abcaea2-39c2-4f9f-85fe-b51d5e791e17/volumes"
Mar 14 06:52:43 crc kubenswrapper[4713]: I0314 06:52:43.673065 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7dc7444945-bll47"
Mar 14 06:52:47 crc kubenswrapper[4713]: I0314 06:52:47.088699 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 14 06:52:48 crc kubenswrapper[4713]: I0314 06:52:48.926310 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9tzcq" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:48 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:48 crc kubenswrapper[4713]: >
Mar 14 06:52:49 crc kubenswrapper[4713]: I0314 06:52:49.410840 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ncc7b" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:49 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:49 crc kubenswrapper[4713]: >
Mar 14 06:52:51 crc kubenswrapper[4713]: I0314 06:52:51.727983 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-qhnqn" podUID="fc4d2c5d-cf64-489f-9229-3e79a6e369c3" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:52:51 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:52:51 crc kubenswrapper[4713]: >
Mar 14 06:52:51 crc kubenswrapper[4713]: I0314 06:52:51.878014 4713
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:51 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:51 crc kubenswrapper[4713]: > Mar 14 06:52:56 crc kubenswrapper[4713]: I0314 06:52:56.327389 4713 scope.go:117] "RemoveContainer" containerID="e7135d1a6f4bc49cc0e885b5c22866db6a625feb52cf053de5ac05e2c11f025c" Mar 14 06:52:58 crc kubenswrapper[4713]: I0314 06:52:58.991388 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9tzcq" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:58 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:58 crc kubenswrapper[4713]: > Mar 14 06:52:59 crc kubenswrapper[4713]: I0314 06:52:59.414401 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ncc7b" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="registry-server" probeResult="failure" output=< Mar 14 06:52:59 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:52:59 crc kubenswrapper[4713]: > Mar 14 06:53:00 crc kubenswrapper[4713]: I0314 06:53:00.713283 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 06:53:00 crc kubenswrapper[4713]: I0314 06:53:00.771792 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qhnqn" Mar 14 06:53:01 crc kubenswrapper[4713]: I0314 06:53:01.873505 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" 
probeResult="failure" output=< Mar 14 06:53:01 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:53:01 crc kubenswrapper[4713]: > Mar 14 06:53:08 crc kubenswrapper[4713]: I0314 06:53:08.448230 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:53:08 crc kubenswrapper[4713]: I0314 06:53:08.535242 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:53:08 crc kubenswrapper[4713]: I0314 06:53:08.961738 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9tzcq" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="registry-server" probeResult="failure" output=< Mar 14 06:53:08 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:53:08 crc kubenswrapper[4713]: > Mar 14 06:53:11 crc kubenswrapper[4713]: I0314 06:53:11.859736 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=< Mar 14 06:53:11 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:53:11 crc kubenswrapper[4713]: > Mar 14 06:53:13 crc kubenswrapper[4713]: I0314 06:53:13.700718 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ncc7b"] Mar 14 06:53:13 crc kubenswrapper[4713]: I0314 06:53:13.716314 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ncc7b" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="registry-server" containerID="cri-o://03a7f1f220cb3842edd529d73f440b17fbdd54d75f7db0cbc8864168188ec95d" gracePeriod=2 Mar 14 06:53:14 crc kubenswrapper[4713]: I0314 06:53:14.048357 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncc7b" event={"ID":"066d38a5-380e-465e-912b-2fe268a4b4c4","Type":"ContainerDied","Data":"03a7f1f220cb3842edd529d73f440b17fbdd54d75f7db0cbc8864168188ec95d"} Mar 14 06:53:14 crc kubenswrapper[4713]: I0314 06:53:14.048684 4713 generic.go:334] "Generic (PLEG): container finished" podID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerID="03a7f1f220cb3842edd529d73f440b17fbdd54d75f7db0cbc8864168188ec95d" exitCode=0 Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.209401 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.280588 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzd5z\" (UniqueName: \"kubernetes.io/projected/066d38a5-380e-465e-912b-2fe268a4b4c4-kube-api-access-zzd5z\") pod \"066d38a5-380e-465e-912b-2fe268a4b4c4\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.280826 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-catalog-content\") pod \"066d38a5-380e-465e-912b-2fe268a4b4c4\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.281267 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-utilities\") pod \"066d38a5-380e-465e-912b-2fe268a4b4c4\" (UID: \"066d38a5-380e-465e-912b-2fe268a4b4c4\") " Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.283763 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-utilities" 
(OuterVolumeSpecName: "utilities") pod "066d38a5-380e-465e-912b-2fe268a4b4c4" (UID: "066d38a5-380e-465e-912b-2fe268a4b4c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.312995 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066d38a5-380e-465e-912b-2fe268a4b4c4-kube-api-access-zzd5z" (OuterVolumeSpecName: "kube-api-access-zzd5z") pod "066d38a5-380e-465e-912b-2fe268a4b4c4" (UID: "066d38a5-380e-465e-912b-2fe268a4b4c4"). InnerVolumeSpecName "kube-api-access-zzd5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.385535 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.387112 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzd5z\" (UniqueName: \"kubernetes.io/projected/066d38a5-380e-465e-912b-2fe268a4b4c4-kube-api-access-zzd5z\") on node \"crc\" DevicePath \"\"" Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.419517 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "066d38a5-380e-465e-912b-2fe268a4b4c4" (UID: "066d38a5-380e-465e-912b-2fe268a4b4c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:53:15 crc kubenswrapper[4713]: I0314 06:53:15.489596 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/066d38a5-380e-465e-912b-2fe268a4b4c4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:53:16 crc kubenswrapper[4713]: I0314 06:53:16.075248 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncc7b" event={"ID":"066d38a5-380e-465e-912b-2fe268a4b4c4","Type":"ContainerDied","Data":"a455efc45ec47abd843d1ab224feb87877299f3f9925e8f40ca7a3c666852c5d"} Mar 14 06:53:16 crc kubenswrapper[4713]: I0314 06:53:16.075321 4713 scope.go:117] "RemoveContainer" containerID="03a7f1f220cb3842edd529d73f440b17fbdd54d75f7db0cbc8864168188ec95d" Mar 14 06:53:16 crc kubenswrapper[4713]: I0314 06:53:16.075444 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncc7b" Mar 14 06:53:16 crc kubenswrapper[4713]: I0314 06:53:16.105620 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ncc7b"] Mar 14 06:53:16 crc kubenswrapper[4713]: I0314 06:53:16.117125 4713 scope.go:117] "RemoveContainer" containerID="1c36a07fee5bbbd358c73c10d9c2f80f6405cf7c8a3a9aca6bc96f355d1febd7" Mar 14 06:53:16 crc kubenswrapper[4713]: I0314 06:53:16.121417 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ncc7b"] Mar 14 06:53:16 crc kubenswrapper[4713]: I0314 06:53:16.142570 4713 scope.go:117] "RemoveContainer" containerID="aa3e8277457b69aefdd071a90bdc353a450a5b3f3e2bf2c0aae1c8781f25ce2a" Mar 14 06:53:17 crc kubenswrapper[4713]: I0314 06:53:17.577532 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" path="/var/lib/kubelet/pods/066d38a5-380e-465e-912b-2fe268a4b4c4/volumes" Mar 14 06:53:17 crc 
kubenswrapper[4713]: I0314 06:53:17.927473 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:53:17 crc kubenswrapper[4713]: I0314 06:53:17.976948 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:53:21 crc kubenswrapper[4713]: I0314 06:53:21.878892 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=< Mar 14 06:53:21 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:53:21 crc kubenswrapper[4713]: > Mar 14 06:53:23 crc kubenswrapper[4713]: I0314 06:53:23.892696 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9tzcq"] Mar 14 06:53:23 crc kubenswrapper[4713]: I0314 06:53:23.893604 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9tzcq" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="registry-server" containerID="cri-o://f62d443f825315fd7daad91741162a0adf6f0cdeb7f6383ee31470f443bb7a60" gracePeriod=2 Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.171680 4713 generic.go:334] "Generic (PLEG): container finished" podID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerID="f62d443f825315fd7daad91741162a0adf6f0cdeb7f6383ee31470f443bb7a60" exitCode=0 Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.171728 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tzcq" event={"ID":"8026f430-07d6-4f1a-98ac-b39a9ad6130d","Type":"ContainerDied","Data":"f62d443f825315fd7daad91741162a0adf6f0cdeb7f6383ee31470f443bb7a60"} Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.656892 4713 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.747034 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-utilities\") pod \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.747230 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-catalog-content\") pod \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.747269 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn7q4\" (UniqueName: \"kubernetes.io/projected/8026f430-07d6-4f1a-98ac-b39a9ad6130d-kube-api-access-tn7q4\") pod \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\" (UID: \"8026f430-07d6-4f1a-98ac-b39a9ad6130d\") " Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.750092 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-utilities" (OuterVolumeSpecName: "utilities") pod "8026f430-07d6-4f1a-98ac-b39a9ad6130d" (UID: "8026f430-07d6-4f1a-98ac-b39a9ad6130d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.785412 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8026f430-07d6-4f1a-98ac-b39a9ad6130d-kube-api-access-tn7q4" (OuterVolumeSpecName: "kube-api-access-tn7q4") pod "8026f430-07d6-4f1a-98ac-b39a9ad6130d" (UID: "8026f430-07d6-4f1a-98ac-b39a9ad6130d"). 
InnerVolumeSpecName "kube-api-access-tn7q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.853138 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.853426 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn7q4\" (UniqueName: \"kubernetes.io/projected/8026f430-07d6-4f1a-98ac-b39a9ad6130d-kube-api-access-tn7q4\") on node \"crc\" DevicePath \"\"" Mar 14 06:53:24 crc kubenswrapper[4713]: I0314 06:53:24.978623 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8026f430-07d6-4f1a-98ac-b39a9ad6130d" (UID: "8026f430-07d6-4f1a-98ac-b39a9ad6130d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:53:25 crc kubenswrapper[4713]: I0314 06:53:25.060056 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8026f430-07d6-4f1a-98ac-b39a9ad6130d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:53:25 crc kubenswrapper[4713]: I0314 06:53:25.183798 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tzcq" event={"ID":"8026f430-07d6-4f1a-98ac-b39a9ad6130d","Type":"ContainerDied","Data":"c51fa660c0a56582884dd4ae4bcc50e5b9366ea2a84ec610ff3c60eb7544fcb5"} Mar 14 06:53:25 crc kubenswrapper[4713]: I0314 06:53:25.183845 4713 scope.go:117] "RemoveContainer" containerID="f62d443f825315fd7daad91741162a0adf6f0cdeb7f6383ee31470f443bb7a60" Mar 14 06:53:25 crc kubenswrapper[4713]: I0314 06:53:25.183988 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tzcq" Mar 14 06:53:25 crc kubenswrapper[4713]: I0314 06:53:25.228870 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9tzcq"] Mar 14 06:53:25 crc kubenswrapper[4713]: I0314 06:53:25.230771 4713 scope.go:117] "RemoveContainer" containerID="0ab3e67b0269d167a6e7c98bbe5ea2e66a708306c837341e10b8ca0f9b27e966" Mar 14 06:53:25 crc kubenswrapper[4713]: I0314 06:53:25.246577 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9tzcq"] Mar 14 06:53:25 crc kubenswrapper[4713]: I0314 06:53:25.577730 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" path="/var/lib/kubelet/pods/8026f430-07d6-4f1a-98ac-b39a9ad6130d/volumes" Mar 14 06:53:25 crc kubenswrapper[4713]: I0314 06:53:25.835794 4713 scope.go:117] "RemoveContainer" containerID="d8f57c45086806229d2c6327568f9c70f443a719549e79eccc20bebc0d707fb4" Mar 14 06:53:31 crc kubenswrapper[4713]: I0314 06:53:31.856311 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=< Mar 14 06:53:31 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:53:31 crc kubenswrapper[4713]: > Mar 14 06:53:41 crc kubenswrapper[4713]: I0314 06:53:41.963637 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=< Mar 14 06:53:41 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:53:41 crc kubenswrapper[4713]: > Mar 14 06:53:52 crc kubenswrapper[4713]: I0314 06:53:52.545526 4713 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-525cq" podUID="d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd" containerName="registry-server" probeResult="failure" output=< Mar 14 06:53:52 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:53:52 crc kubenswrapper[4713]: > Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.343565 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557854-5n4zr"] Mar 14 06:54:00 crc kubenswrapper[4713]: E0314 06:54:00.351713 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="extract-content" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.351752 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="extract-content" Mar 14 06:54:00 crc kubenswrapper[4713]: E0314 06:54:00.351784 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97afee50-131f-4e19-a172-0020e3607abc" containerName="oc" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.351795 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="97afee50-131f-4e19-a172-0020e3607abc" containerName="oc" Mar 14 06:54:00 crc kubenswrapper[4713]: E0314 06:54:00.351817 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="extract-content" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.351824 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="extract-content" Mar 14 06:54:00 crc kubenswrapper[4713]: E0314 06:54:00.351840 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="extract-utilities" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.351847 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="extract-utilities" Mar 14 06:54:00 crc kubenswrapper[4713]: E0314 06:54:00.351870 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="registry-server" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.351876 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="registry-server" Mar 14 06:54:00 crc kubenswrapper[4713]: E0314 06:54:00.351897 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="registry-server" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.351903 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="registry-server" Mar 14 06:54:00 crc kubenswrapper[4713]: E0314 06:54:00.351917 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="extract-utilities" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.351923 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="extract-utilities" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.352885 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="066d38a5-380e-465e-912b-2fe268a4b4c4" containerName="registry-server" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.352935 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="97afee50-131f-4e19-a172-0020e3607abc" containerName="oc" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.352948 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8026f430-07d6-4f1a-98ac-b39a9ad6130d" containerName="registry-server" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.359993 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557854-5n4zr" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.376279 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.376338 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.376609 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.404534 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88x8v\" (UniqueName: \"kubernetes.io/projected/6a4ad9d9-16f3-4eec-9db5-3338559c4bb0-kube-api-access-88x8v\") pod \"auto-csr-approver-29557854-5n4zr\" (UID: \"6a4ad9d9-16f3-4eec-9db5-3338559c4bb0\") " pod="openshift-infra/auto-csr-approver-29557854-5n4zr" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.456154 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557854-5n4zr"] Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.506531 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88x8v\" (UniqueName: \"kubernetes.io/projected/6a4ad9d9-16f3-4eec-9db5-3338559c4bb0-kube-api-access-88x8v\") pod \"auto-csr-approver-29557854-5n4zr\" (UID: \"6a4ad9d9-16f3-4eec-9db5-3338559c4bb0\") " pod="openshift-infra/auto-csr-approver-29557854-5n4zr" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.549636 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88x8v\" (UniqueName: \"kubernetes.io/projected/6a4ad9d9-16f3-4eec-9db5-3338559c4bb0-kube-api-access-88x8v\") pod \"auto-csr-approver-29557854-5n4zr\" (UID: \"6a4ad9d9-16f3-4eec-9db5-3338559c4bb0\") " 
pod="openshift-infra/auto-csr-approver-29557854-5n4zr" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.693841 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557854-5n4zr" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.870282 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-525cq" Mar 14 06:54:00 crc kubenswrapper[4713]: I0314 06:54:00.933005 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-525cq" Mar 14 06:54:02 crc kubenswrapper[4713]: I0314 06:54:02.042882 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557854-5n4zr"] Mar 14 06:54:02 crc kubenswrapper[4713]: I0314 06:54:02.575572 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557854-5n4zr" event={"ID":"6a4ad9d9-16f3-4eec-9db5-3338559c4bb0","Type":"ContainerStarted","Data":"d887edbf95292ae11213b63ea6ee6b5e192cfc6d32bc28e8decbcd002c9b6bd3"} Mar 14 06:54:04 crc kubenswrapper[4713]: I0314 06:54:04.598572 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557854-5n4zr" event={"ID":"6a4ad9d9-16f3-4eec-9db5-3338559c4bb0","Type":"ContainerStarted","Data":"3806360501eea56d95b268cf9a52a33a41381b4f735e6c11a2cc23e2b4864145"} Mar 14 06:54:04 crc kubenswrapper[4713]: I0314 06:54:04.621709 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557854-5n4zr" podStartSLOduration=3.7530733229999997 podStartE2EDuration="4.617917171s" podCreationTimestamp="2026-03-14 06:54:00 +0000 UTC" firstStartedPulling="2026-03-14 06:54:02.083791418 +0000 UTC m=+5225.171700708" lastFinishedPulling="2026-03-14 06:54:02.948635266 +0000 UTC m=+5226.036544556" observedRunningTime="2026-03-14 06:54:04.614581606 +0000 UTC 
m=+5227.702490906" watchObservedRunningTime="2026-03-14 06:54:04.617917171 +0000 UTC m=+5227.705826471" Mar 14 06:54:05 crc kubenswrapper[4713]: I0314 06:54:05.611445 4713 generic.go:334] "Generic (PLEG): container finished" podID="6a4ad9d9-16f3-4eec-9db5-3338559c4bb0" containerID="3806360501eea56d95b268cf9a52a33a41381b4f735e6c11a2cc23e2b4864145" exitCode=0 Mar 14 06:54:05 crc kubenswrapper[4713]: I0314 06:54:05.611557 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557854-5n4zr" event={"ID":"6a4ad9d9-16f3-4eec-9db5-3338559c4bb0","Type":"ContainerDied","Data":"3806360501eea56d95b268cf9a52a33a41381b4f735e6c11a2cc23e2b4864145"} Mar 14 06:54:07 crc kubenswrapper[4713]: I0314 06:54:07.182792 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557854-5n4zr" Mar 14 06:54:07 crc kubenswrapper[4713]: I0314 06:54:07.210893 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88x8v\" (UniqueName: \"kubernetes.io/projected/6a4ad9d9-16f3-4eec-9db5-3338559c4bb0-kube-api-access-88x8v\") pod \"6a4ad9d9-16f3-4eec-9db5-3338559c4bb0\" (UID: \"6a4ad9d9-16f3-4eec-9db5-3338559c4bb0\") " Mar 14 06:54:07 crc kubenswrapper[4713]: I0314 06:54:07.230420 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a4ad9d9-16f3-4eec-9db5-3338559c4bb0-kube-api-access-88x8v" (OuterVolumeSpecName: "kube-api-access-88x8v") pod "6a4ad9d9-16f3-4eec-9db5-3338559c4bb0" (UID: "6a4ad9d9-16f3-4eec-9db5-3338559c4bb0"). InnerVolumeSpecName "kube-api-access-88x8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:54:07 crc kubenswrapper[4713]: I0314 06:54:07.313016 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88x8v\" (UniqueName: \"kubernetes.io/projected/6a4ad9d9-16f3-4eec-9db5-3338559c4bb0-kube-api-access-88x8v\") on node \"crc\" DevicePath \"\"" Mar 14 06:54:07 crc kubenswrapper[4713]: I0314 06:54:07.634313 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557854-5n4zr" event={"ID":"6a4ad9d9-16f3-4eec-9db5-3338559c4bb0","Type":"ContainerDied","Data":"d887edbf95292ae11213b63ea6ee6b5e192cfc6d32bc28e8decbcd002c9b6bd3"} Mar 14 06:54:07 crc kubenswrapper[4713]: I0314 06:54:07.634518 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557854-5n4zr" Mar 14 06:54:07 crc kubenswrapper[4713]: I0314 06:54:07.634710 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d887edbf95292ae11213b63ea6ee6b5e192cfc6d32bc28e8decbcd002c9b6bd3" Mar 14 06:54:07 crc kubenswrapper[4713]: I0314 06:54:07.700575 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557848-qssgx"] Mar 14 06:54:07 crc kubenswrapper[4713]: I0314 06:54:07.710335 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557848-qssgx"] Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.142415 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gnv2l"] Mar 14 06:54:08 crc kubenswrapper[4713]: E0314 06:54:08.143190 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4ad9d9-16f3-4eec-9db5-3338559c4bb0" containerName="oc" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.143219 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4ad9d9-16f3-4eec-9db5-3338559c4bb0" containerName="oc" Mar 14 06:54:08 crc 
kubenswrapper[4713]: I0314 06:54:08.143504 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4ad9d9-16f3-4eec-9db5-3338559c4bb0" containerName="oc" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.145708 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.161636 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnv2l"] Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.336099 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n96vr\" (UniqueName: \"kubernetes.io/projected/2df835a9-5c77-441e-83d6-065de7358c18-kube-api-access-n96vr\") pod \"redhat-marketplace-gnv2l\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.336186 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-utilities\") pod \"redhat-marketplace-gnv2l\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.336817 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-catalog-content\") pod \"redhat-marketplace-gnv2l\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.439615 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-utilities\") pod \"redhat-marketplace-gnv2l\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.439805 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-catalog-content\") pod \"redhat-marketplace-gnv2l\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.439899 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n96vr\" (UniqueName: \"kubernetes.io/projected/2df835a9-5c77-441e-83d6-065de7358c18-kube-api-access-n96vr\") pod \"redhat-marketplace-gnv2l\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.440728 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-utilities\") pod \"redhat-marketplace-gnv2l\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.440786 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-catalog-content\") pod \"redhat-marketplace-gnv2l\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.463699 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n96vr\" (UniqueName: 
\"kubernetes.io/projected/2df835a9-5c77-441e-83d6-065de7358c18-kube-api-access-n96vr\") pod \"redhat-marketplace-gnv2l\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:08 crc kubenswrapper[4713]: I0314 06:54:08.532807 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:09 crc kubenswrapper[4713]: I0314 06:54:09.012669 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnv2l"] Mar 14 06:54:09 crc kubenswrapper[4713]: W0314 06:54:09.018501 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2df835a9_5c77_441e_83d6_065de7358c18.slice/crio-69890c672d8e8c24a2b6c714c44250975d614a022cc49c60396f36a05a5b086d WatchSource:0}: Error finding container 69890c672d8e8c24a2b6c714c44250975d614a022cc49c60396f36a05a5b086d: Status 404 returned error can't find the container with id 69890c672d8e8c24a2b6c714c44250975d614a022cc49c60396f36a05a5b086d Mar 14 06:54:09 crc kubenswrapper[4713]: I0314 06:54:09.578801 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d072e9-8b3c-4ffd-925c-d908ada34f3d" path="/var/lib/kubelet/pods/93d072e9-8b3c-4ffd-925c-d908ada34f3d/volumes" Mar 14 06:54:09 crc kubenswrapper[4713]: I0314 06:54:09.657460 4713 generic.go:334] "Generic (PLEG): container finished" podID="2df835a9-5c77-441e-83d6-065de7358c18" containerID="5e5b35cf5f7037f79196181772eeb7c5c74da87729de91e219f0e829944c6baa" exitCode=0 Mar 14 06:54:09 crc kubenswrapper[4713]: I0314 06:54:09.657511 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnv2l" event={"ID":"2df835a9-5c77-441e-83d6-065de7358c18","Type":"ContainerDied","Data":"5e5b35cf5f7037f79196181772eeb7c5c74da87729de91e219f0e829944c6baa"} Mar 14 06:54:09 crc kubenswrapper[4713]: I0314 
06:54:09.657541 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnv2l" event={"ID":"2df835a9-5c77-441e-83d6-065de7358c18","Type":"ContainerStarted","Data":"69890c672d8e8c24a2b6c714c44250975d614a022cc49c60396f36a05a5b086d"} Mar 14 06:54:10 crc kubenswrapper[4713]: I0314 06:54:10.679662 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnv2l" event={"ID":"2df835a9-5c77-441e-83d6-065de7358c18","Type":"ContainerStarted","Data":"bf6f0f2bbf65c5dc2467575be7e74fc1fe4bd6a51973125a2beaab19ab022ae2"} Mar 14 06:54:12 crc kubenswrapper[4713]: I0314 06:54:12.700709 4713 generic.go:334] "Generic (PLEG): container finished" podID="2df835a9-5c77-441e-83d6-065de7358c18" containerID="bf6f0f2bbf65c5dc2467575be7e74fc1fe4bd6a51973125a2beaab19ab022ae2" exitCode=0 Mar 14 06:54:12 crc kubenswrapper[4713]: I0314 06:54:12.700815 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnv2l" event={"ID":"2df835a9-5c77-441e-83d6-065de7358c18","Type":"ContainerDied","Data":"bf6f0f2bbf65c5dc2467575be7e74fc1fe4bd6a51973125a2beaab19ab022ae2"} Mar 14 06:54:13 crc kubenswrapper[4713]: I0314 06:54:13.738571 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnv2l" event={"ID":"2df835a9-5c77-441e-83d6-065de7358c18","Type":"ContainerStarted","Data":"4b184a0ff04f0d9e4e3ebac4fad3ab4b84ffb4e32529569a0efc579d13bda98e"} Mar 14 06:54:13 crc kubenswrapper[4713]: I0314 06:54:13.767056 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gnv2l" podStartSLOduration=2.330790605 podStartE2EDuration="5.767027828s" podCreationTimestamp="2026-03-14 06:54:08 +0000 UTC" firstStartedPulling="2026-03-14 06:54:09.659707173 +0000 UTC m=+5232.747616463" lastFinishedPulling="2026-03-14 06:54:13.095944386 +0000 UTC m=+5236.183853686" observedRunningTime="2026-03-14 
06:54:13.75946889 +0000 UTC m=+5236.847378200" watchObservedRunningTime="2026-03-14 06:54:13.767027828 +0000 UTC m=+5236.854937128" Mar 14 06:54:18 crc kubenswrapper[4713]: I0314 06:54:18.534164 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:18 crc kubenswrapper[4713]: I0314 06:54:18.534451 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:19 crc kubenswrapper[4713]: I0314 06:54:19.592356 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-gnv2l" podUID="2df835a9-5c77-441e-83d6-065de7358c18" containerName="registry-server" probeResult="failure" output=< Mar 14 06:54:19 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 06:54:19 crc kubenswrapper[4713]: > Mar 14 06:54:29 crc kubenswrapper[4713]: I0314 06:54:29.055673 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:29 crc kubenswrapper[4713]: I0314 06:54:29.109467 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:29 crc kubenswrapper[4713]: I0314 06:54:29.302707 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnv2l"] Mar 14 06:54:30 crc kubenswrapper[4713]: I0314 06:54:30.937658 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gnv2l" podUID="2df835a9-5c77-441e-83d6-065de7358c18" containerName="registry-server" containerID="cri-o://4b184a0ff04f0d9e4e3ebac4fad3ab4b84ffb4e32529569a0efc579d13bda98e" gracePeriod=2 Mar 14 06:54:31 crc kubenswrapper[4713]: E0314 06:54:31.391529 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2df835a9_5c77_441e_83d6_065de7358c18.slice/crio-conmon-4b184a0ff04f0d9e4e3ebac4fad3ab4b84ffb4e32529569a0efc579d13bda98e.scope\": RecentStats: unable to find data in memory cache]" Mar 14 06:54:31 crc kubenswrapper[4713]: I0314 06:54:31.952117 4713 generic.go:334] "Generic (PLEG): container finished" podID="2df835a9-5c77-441e-83d6-065de7358c18" containerID="4b184a0ff04f0d9e4e3ebac4fad3ab4b84ffb4e32529569a0efc579d13bda98e" exitCode=0 Mar 14 06:54:31 crc kubenswrapper[4713]: I0314 06:54:31.952424 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnv2l" event={"ID":"2df835a9-5c77-441e-83d6-065de7358c18","Type":"ContainerDied","Data":"4b184a0ff04f0d9e4e3ebac4fad3ab4b84ffb4e32529569a0efc579d13bda98e"} Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.324652 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.452311 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n96vr\" (UniqueName: \"kubernetes.io/projected/2df835a9-5c77-441e-83d6-065de7358c18-kube-api-access-n96vr\") pod \"2df835a9-5c77-441e-83d6-065de7358c18\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.452395 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-utilities\") pod \"2df835a9-5c77-441e-83d6-065de7358c18\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.452634 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-catalog-content\") pod \"2df835a9-5c77-441e-83d6-065de7358c18\" (UID: \"2df835a9-5c77-441e-83d6-065de7358c18\") " Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.455631 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-utilities" (OuterVolumeSpecName: "utilities") pod "2df835a9-5c77-441e-83d6-065de7358c18" (UID: "2df835a9-5c77-441e-83d6-065de7358c18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.479767 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df835a9-5c77-441e-83d6-065de7358c18-kube-api-access-n96vr" (OuterVolumeSpecName: "kube-api-access-n96vr") pod "2df835a9-5c77-441e-83d6-065de7358c18" (UID: "2df835a9-5c77-441e-83d6-065de7358c18"). InnerVolumeSpecName "kube-api-access-n96vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.527222 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2df835a9-5c77-441e-83d6-065de7358c18" (UID: "2df835a9-5c77-441e-83d6-065de7358c18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.556273 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.556317 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n96vr\" (UniqueName: \"kubernetes.io/projected/2df835a9-5c77-441e-83d6-065de7358c18-kube-api-access-n96vr\") on node \"crc\" DevicePath \"\"" Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.556330 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df835a9-5c77-441e-83d6-065de7358c18-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.965331 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnv2l" event={"ID":"2df835a9-5c77-441e-83d6-065de7358c18","Type":"ContainerDied","Data":"69890c672d8e8c24a2b6c714c44250975d614a022cc49c60396f36a05a5b086d"} Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.965400 4713 scope.go:117] "RemoveContainer" containerID="4b184a0ff04f0d9e4e3ebac4fad3ab4b84ffb4e32529569a0efc579d13bda98e" Mar 14 06:54:32 crc kubenswrapper[4713]: I0314 06:54:32.965419 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gnv2l" Mar 14 06:54:33 crc kubenswrapper[4713]: I0314 06:54:33.003923 4713 scope.go:117] "RemoveContainer" containerID="bf6f0f2bbf65c5dc2467575be7e74fc1fe4bd6a51973125a2beaab19ab022ae2" Mar 14 06:54:33 crc kubenswrapper[4713]: I0314 06:54:33.023389 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnv2l"] Mar 14 06:54:33 crc kubenswrapper[4713]: I0314 06:54:33.029895 4713 scope.go:117] "RemoveContainer" containerID="5e5b35cf5f7037f79196181772eeb7c5c74da87729de91e219f0e829944c6baa" Mar 14 06:54:33 crc kubenswrapper[4713]: I0314 06:54:33.036467 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnv2l"] Mar 14 06:54:33 crc kubenswrapper[4713]: I0314 06:54:33.575453 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df835a9-5c77-441e-83d6-065de7358c18" path="/var/lib/kubelet/pods/2df835a9-5c77-441e-83d6-065de7358c18/volumes" Mar 14 06:54:40 crc kubenswrapper[4713]: I0314 06:54:40.731657 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:54:40 crc kubenswrapper[4713]: I0314 06:54:40.732990 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:54:51 crc kubenswrapper[4713]: I0314 06:54:51.153222 4713 generic.go:334] "Generic (PLEG): container finished" podID="d7cea157-995e-400c-b2ee-85357ae7fb7b" 
containerID="6a51f0f57d059e36da820b4d22f05bd917ee3f70f54b4ba1f35b3ebf56efeb80" exitCode=0 Mar 14 06:54:51 crc kubenswrapper[4713]: I0314 06:54:51.154429 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" event={"ID":"d7cea157-995e-400c-b2ee-85357ae7fb7b","Type":"ContainerDied","Data":"6a51f0f57d059e36da820b4d22f05bd917ee3f70f54b4ba1f35b3ebf56efeb80"} Mar 14 06:54:52 crc kubenswrapper[4713]: I0314 06:54:52.170762 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" event={"ID":"d7cea157-995e-400c-b2ee-85357ae7fb7b","Type":"ContainerStarted","Data":"a7dc7a9285c10996a51bf8d61c60859a4fc4f65a92c5acdad7b334f848258f28"} Mar 14 06:54:56 crc kubenswrapper[4713]: I0314 06:54:56.823172 4713 scope.go:117] "RemoveContainer" containerID="6c79b52545c45df99b51f2edb6c404c1996b31d9f681258c10adbf6d6dc53c70" Mar 14 06:54:58 crc kubenswrapper[4713]: I0314 06:54:58.695361 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 06:54:58 crc kubenswrapper[4713]: I0314 06:54:58.695684 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 06:55:10 crc kubenswrapper[4713]: I0314 06:55:10.732243 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:55:10 crc kubenswrapper[4713]: I0314 06:55:10.733114 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:55:18 crc kubenswrapper[4713]: I0314 06:55:18.707789 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 06:55:18 crc kubenswrapper[4713]: I0314 06:55:18.713017 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6d5f446985-q8pw2" Mar 14 06:55:40 crc kubenswrapper[4713]: I0314 06:55:40.732029 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 06:55:40 crc kubenswrapper[4713]: I0314 06:55:40.732578 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 06:55:40 crc kubenswrapper[4713]: I0314 06:55:40.732626 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 06:55:40 crc kubenswrapper[4713]: I0314 06:55:40.733645 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 06:55:40 crc kubenswrapper[4713]: I0314 06:55:40.733713 4713 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4" gracePeriod=600 Mar 14 06:55:41 crc kubenswrapper[4713]: I0314 06:55:41.680068 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4" exitCode=0 Mar 14 06:55:41 crc kubenswrapper[4713]: I0314 06:55:41.680145 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"} Mar 14 06:55:41 crc kubenswrapper[4713]: I0314 06:55:41.680445 4713 scope.go:117] "RemoveContainer" containerID="47a21fc91748367e2fa2fed03723a1e1afd08f67eeb72f7a50138295a534bf0e" Mar 14 06:55:41 crc kubenswrapper[4713]: E0314 06:55:41.758726 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:55:42 crc kubenswrapper[4713]: I0314 06:55:42.693097 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4" Mar 14 06:55:42 crc kubenswrapper[4713]: E0314 06:55:42.693922 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:55:53 crc kubenswrapper[4713]: I0314 06:55:53.565512 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4" Mar 14 06:55:53 crc kubenswrapper[4713]: E0314 06:55:53.566634 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.142509 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557856-kx78m"] Mar 14 06:56:00 crc kubenswrapper[4713]: E0314 06:56:00.143624 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df835a9-5c77-441e-83d6-065de7358c18" containerName="extract-content" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.143640 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df835a9-5c77-441e-83d6-065de7358c18" containerName="extract-content" Mar 14 06:56:00 crc kubenswrapper[4713]: E0314 06:56:00.143686 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df835a9-5c77-441e-83d6-065de7358c18" containerName="registry-server" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.143695 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df835a9-5c77-441e-83d6-065de7358c18" containerName="registry-server" Mar 14 06:56:00 crc kubenswrapper[4713]: E0314 06:56:00.143706 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2df835a9-5c77-441e-83d6-065de7358c18" containerName="extract-utilities" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.143713 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df835a9-5c77-441e-83d6-065de7358c18" containerName="extract-utilities" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.143915 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df835a9-5c77-441e-83d6-065de7358c18" containerName="registry-server" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.176117 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557856-kx78m"] Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.176236 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557856-kx78m" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.179674 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.179772 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.179887 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.283973 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5r57\" (UniqueName: \"kubernetes.io/projected/27956939-a5c2-4749-a280-b1ed90490162-kube-api-access-n5r57\") pod \"auto-csr-approver-29557856-kx78m\" (UID: \"27956939-a5c2-4749-a280-b1ed90490162\") " pod="openshift-infra/auto-csr-approver-29557856-kx78m" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.387408 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5r57\" (UniqueName: 
\"kubernetes.io/projected/27956939-a5c2-4749-a280-b1ed90490162-kube-api-access-n5r57\") pod \"auto-csr-approver-29557856-kx78m\" (UID: \"27956939-a5c2-4749-a280-b1ed90490162\") " pod="openshift-infra/auto-csr-approver-29557856-kx78m" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.410295 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5r57\" (UniqueName: \"kubernetes.io/projected/27956939-a5c2-4749-a280-b1ed90490162-kube-api-access-n5r57\") pod \"auto-csr-approver-29557856-kx78m\" (UID: \"27956939-a5c2-4749-a280-b1ed90490162\") " pod="openshift-infra/auto-csr-approver-29557856-kx78m" Mar 14 06:56:00 crc kubenswrapper[4713]: I0314 06:56:00.495198 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557856-kx78m" Mar 14 06:56:01 crc kubenswrapper[4713]: I0314 06:56:01.435550 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557856-kx78m"] Mar 14 06:56:01 crc kubenswrapper[4713]: I0314 06:56:01.887670 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557856-kx78m" event={"ID":"27956939-a5c2-4749-a280-b1ed90490162","Type":"ContainerStarted","Data":"903ff9c5d1c371687190c557189bf7aa6e0b002384ad5c9b0578941e5d806e14"} Mar 14 06:56:04 crc kubenswrapper[4713]: I0314 06:56:04.921151 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557856-kx78m" event={"ID":"27956939-a5c2-4749-a280-b1ed90490162","Type":"ContainerStarted","Data":"79f339aea60b4b139cd236346c0f132c6acf4b515103ce663dd6dd89668bcfb2"} Mar 14 06:56:04 crc kubenswrapper[4713]: I0314 06:56:04.943613 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557856-kx78m" podStartSLOduration=3.514101268 podStartE2EDuration="4.943590138s" podCreationTimestamp="2026-03-14 06:56:00 +0000 UTC" 
firstStartedPulling="2026-03-14 06:56:01.427981389 +0000 UTC m=+5344.515890689" lastFinishedPulling="2026-03-14 06:56:02.857470259 +0000 UTC m=+5345.945379559" observedRunningTime="2026-03-14 06:56:04.935680649 +0000 UTC m=+5348.023589959" watchObservedRunningTime="2026-03-14 06:56:04.943590138 +0000 UTC m=+5348.031499438"
Mar 14 06:56:06 crc kubenswrapper[4713]: I0314 06:56:06.943738 4713 generic.go:334] "Generic (PLEG): container finished" podID="27956939-a5c2-4749-a280-b1ed90490162" containerID="79f339aea60b4b139cd236346c0f132c6acf4b515103ce663dd6dd89668bcfb2" exitCode=0
Mar 14 06:56:06 crc kubenswrapper[4713]: I0314 06:56:06.943834 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557856-kx78m" event={"ID":"27956939-a5c2-4749-a280-b1ed90490162","Type":"ContainerDied","Data":"79f339aea60b4b139cd236346c0f132c6acf4b515103ce663dd6dd89668bcfb2"}
Mar 14 06:56:08 crc kubenswrapper[4713]: I0314 06:56:08.378466 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557856-kx78m"
Mar 14 06:56:08 crc kubenswrapper[4713]: I0314 06:56:08.490722 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5r57\" (UniqueName: \"kubernetes.io/projected/27956939-a5c2-4749-a280-b1ed90490162-kube-api-access-n5r57\") pod \"27956939-a5c2-4749-a280-b1ed90490162\" (UID: \"27956939-a5c2-4749-a280-b1ed90490162\") "
Mar 14 06:56:08 crc kubenswrapper[4713]: I0314 06:56:08.499810 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27956939-a5c2-4749-a280-b1ed90490162-kube-api-access-n5r57" (OuterVolumeSpecName: "kube-api-access-n5r57") pod "27956939-a5c2-4749-a280-b1ed90490162" (UID: "27956939-a5c2-4749-a280-b1ed90490162"). InnerVolumeSpecName "kube-api-access-n5r57". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:56:08 crc kubenswrapper[4713]: I0314 06:56:08.563999 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:56:08 crc kubenswrapper[4713]: E0314 06:56:08.564515 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:56:08 crc kubenswrapper[4713]: I0314 06:56:08.594193 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5r57\" (UniqueName: \"kubernetes.io/projected/27956939-a5c2-4749-a280-b1ed90490162-kube-api-access-n5r57\") on node \"crc\" DevicePath \"\""
Mar 14 06:56:08 crc kubenswrapper[4713]: I0314 06:56:08.967012 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557856-kx78m" event={"ID":"27956939-a5c2-4749-a280-b1ed90490162","Type":"ContainerDied","Data":"903ff9c5d1c371687190c557189bf7aa6e0b002384ad5c9b0578941e5d806e14"}
Mar 14 06:56:08 crc kubenswrapper[4713]: I0314 06:56:08.967300 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="903ff9c5d1c371687190c557189bf7aa6e0b002384ad5c9b0578941e5d806e14"
Mar 14 06:56:08 crc kubenswrapper[4713]: I0314 06:56:08.967096 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557856-kx78m"
Mar 14 06:56:09 crc kubenswrapper[4713]: I0314 06:56:09.035875 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557850-lvfxz"]
Mar 14 06:56:09 crc kubenswrapper[4713]: I0314 06:56:09.048728 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557850-lvfxz"]
Mar 14 06:56:09 crc kubenswrapper[4713]: I0314 06:56:09.579909 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f54224a-a1ed-4d4d-a9e1-d0d714a114fc" path="/var/lib/kubelet/pods/1f54224a-a1ed-4d4d-a9e1-d0d714a114fc/volumes"
Mar 14 06:56:22 crc kubenswrapper[4713]: I0314 06:56:22.564800 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:56:22 crc kubenswrapper[4713]: E0314 06:56:22.565710 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:56:33 crc kubenswrapper[4713]: I0314 06:56:33.564563 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:56:33 crc kubenswrapper[4713]: E0314 06:56:33.565749 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:56:45 crc kubenswrapper[4713]: I0314 06:56:45.563580 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:56:45 crc kubenswrapper[4713]: E0314 06:56:45.564360 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:56:57 crc kubenswrapper[4713]: I0314 06:56:57.086149 4713 scope.go:117] "RemoveContainer" containerID="d71104223dbf9074ffb3c1a436be1abfd56de7afa932f922155cf1bee5bb298b"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.493743 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5fdwr"]
Mar 14 06:56:59 crc kubenswrapper[4713]: E0314 06:56:59.494696 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27956939-a5c2-4749-a280-b1ed90490162" containerName="oc"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.494709 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="27956939-a5c2-4749-a280-b1ed90490162" containerName="oc"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.494964 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="27956939-a5c2-4749-a280-b1ed90490162" containerName="oc"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.496870 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.513952 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdwr"]
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.570556 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:56:59 crc kubenswrapper[4713]: E0314 06:56:59.570829 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.637410 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-utilities\") pod \"redhat-operators-5fdwr\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") " pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.638264 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-catalog-content\") pod \"redhat-operators-5fdwr\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") " pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.638373 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9t4h\" (UniqueName: \"kubernetes.io/projected/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-kube-api-access-x9t4h\") pod \"redhat-operators-5fdwr\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") " pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.739964 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-catalog-content\") pod \"redhat-operators-5fdwr\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") " pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.740252 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9t4h\" (UniqueName: \"kubernetes.io/projected/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-kube-api-access-x9t4h\") pod \"redhat-operators-5fdwr\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") " pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.740349 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-utilities\") pod \"redhat-operators-5fdwr\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") " pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.740841 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-catalog-content\") pod \"redhat-operators-5fdwr\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") " pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.741306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-utilities\") pod \"redhat-operators-5fdwr\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") " pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.761117 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9t4h\" (UniqueName: \"kubernetes.io/projected/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-kube-api-access-x9t4h\") pod \"redhat-operators-5fdwr\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") " pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:56:59 crc kubenswrapper[4713]: I0314 06:56:59.822385 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:57:00 crc kubenswrapper[4713]: I0314 06:57:00.391647 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdwr"]
Mar 14 06:57:01 crc kubenswrapper[4713]: I0314 06:57:01.529894 4713 generic.go:334] "Generic (PLEG): container finished" podID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerID="bf71d2552d71c07e1f181167d5f9204719ec7860a8ae443a07f446d89e25491f" exitCode=0
Mar 14 06:57:01 crc kubenswrapper[4713]: I0314 06:57:01.530295 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdwr" event={"ID":"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf","Type":"ContainerDied","Data":"bf71d2552d71c07e1f181167d5f9204719ec7860a8ae443a07f446d89e25491f"}
Mar 14 06:57:01 crc kubenswrapper[4713]: I0314 06:57:01.530616 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdwr" event={"ID":"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf","Type":"ContainerStarted","Data":"79bbd413a82610aaee61f2dfadaf7a040c358d6ae505523310e51152a17f3a4d"}
Mar 14 06:57:01 crc kubenswrapper[4713]: I0314 06:57:01.533011 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 06:57:04 crc kubenswrapper[4713]: I0314 06:57:04.564001 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdwr" event={"ID":"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf","Type":"ContainerStarted","Data":"38fcda0c1b1a9e67b2917bdbf02c82624af795475dbb0d3b6e0c23c258cc4f8c"}
Mar 14 06:57:10 crc kubenswrapper[4713]: I0314 06:57:10.631352 4713 generic.go:334] "Generic (PLEG): container finished" podID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerID="38fcda0c1b1a9e67b2917bdbf02c82624af795475dbb0d3b6e0c23c258cc4f8c" exitCode=0
Mar 14 06:57:10 crc kubenswrapper[4713]: I0314 06:57:10.631420 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdwr" event={"ID":"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf","Type":"ContainerDied","Data":"38fcda0c1b1a9e67b2917bdbf02c82624af795475dbb0d3b6e0c23c258cc4f8c"}
Mar 14 06:57:11 crc kubenswrapper[4713]: I0314 06:57:11.644481 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdwr" event={"ID":"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf","Type":"ContainerStarted","Data":"96833d8769693dd706b424d5bc519df2328c4f50867c6c17d9588b970021af6c"}
Mar 14 06:57:11 crc kubenswrapper[4713]: I0314 06:57:11.661464 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5fdwr" podStartSLOduration=2.883094547 podStartE2EDuration="12.66144318s" podCreationTimestamp="2026-03-14 06:56:59 +0000 UTC" firstStartedPulling="2026-03-14 06:57:01.532729185 +0000 UTC m=+5404.620638485" lastFinishedPulling="2026-03-14 06:57:11.311077818 +0000 UTC m=+5414.398987118" observedRunningTime="2026-03-14 06:57:11.657876748 +0000 UTC m=+5414.745786048" watchObservedRunningTime="2026-03-14 06:57:11.66144318 +0000 UTC m=+5414.749352480"
Mar 14 06:57:14 crc kubenswrapper[4713]: I0314 06:57:14.564667 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:57:14 crc kubenswrapper[4713]: E0314 06:57:14.565523 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:57:19 crc kubenswrapper[4713]: I0314 06:57:19.824561 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:57:19 crc kubenswrapper[4713]: I0314 06:57:19.825138 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:57:21 crc kubenswrapper[4713]: I0314 06:57:21.035009 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5fdwr" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:57:21 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:57:21 crc kubenswrapper[4713]: >
Mar 14 06:57:28 crc kubenswrapper[4713]: I0314 06:57:28.565193 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:57:28 crc kubenswrapper[4713]: E0314 06:57:28.566669 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:57:31 crc kubenswrapper[4713]: I0314 06:57:31.551946 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5fdwr" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:57:31 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:57:31 crc kubenswrapper[4713]: >
Mar 14 06:57:40 crc kubenswrapper[4713]: I0314 06:57:40.883151 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5fdwr" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:57:40 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:57:40 crc kubenswrapper[4713]: >
Mar 14 06:57:41 crc kubenswrapper[4713]: I0314 06:57:41.564570 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:57:41 crc kubenswrapper[4713]: E0314 06:57:41.564832 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:57:50 crc kubenswrapper[4713]: I0314 06:57:50.871682 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5fdwr" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:57:50 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:57:50 crc kubenswrapper[4713]: >
Mar 14 06:57:52 crc kubenswrapper[4713]: I0314 06:57:52.563774 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:57:52 crc kubenswrapper[4713]: E0314 06:57:52.564498 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.159609 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557858-qhtj8"]
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.161807 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557858-qhtj8"
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.163450 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557858-qhtj8"]
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.185138 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.185489 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.199703 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.220347 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbfzz\" (UniqueName: \"kubernetes.io/projected/9040d059-f826-4f51-a208-f291f5063f00-kube-api-access-zbfzz\") pod \"auto-csr-approver-29557858-qhtj8\" (UID: \"9040d059-f826-4f51-a208-f291f5063f00\") " pod="openshift-infra/auto-csr-approver-29557858-qhtj8"
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.322461 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbfzz\" (UniqueName: \"kubernetes.io/projected/9040d059-f826-4f51-a208-f291f5063f00-kube-api-access-zbfzz\") pod \"auto-csr-approver-29557858-qhtj8\" (UID: \"9040d059-f826-4f51-a208-f291f5063f00\") " pod="openshift-infra/auto-csr-approver-29557858-qhtj8"
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.347696 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbfzz\" (UniqueName: \"kubernetes.io/projected/9040d059-f826-4f51-a208-f291f5063f00-kube-api-access-zbfzz\") pod \"auto-csr-approver-29557858-qhtj8\" (UID: \"9040d059-f826-4f51-a208-f291f5063f00\") " pod="openshift-infra/auto-csr-approver-29557858-qhtj8"
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.507383 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557858-qhtj8"
Mar 14 06:58:00 crc kubenswrapper[4713]: I0314 06:58:00.881118 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5fdwr" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="registry-server" probeResult="failure" output=<
Mar 14 06:58:00 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 06:58:00 crc kubenswrapper[4713]: >
Mar 14 06:58:01 crc kubenswrapper[4713]: I0314 06:58:01.099725 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557858-qhtj8"]
Mar 14 06:58:01 crc kubenswrapper[4713]: I0314 06:58:01.252564 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557858-qhtj8" event={"ID":"9040d059-f826-4f51-a208-f291f5063f00","Type":"ContainerStarted","Data":"7be98fd6594a94137064166518a701d0f2ebb114cb995f53917a32816143f049"}
Mar 14 06:58:03 crc kubenswrapper[4713]: I0314 06:58:03.289791 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557858-qhtj8" event={"ID":"9040d059-f826-4f51-a208-f291f5063f00","Type":"ContainerStarted","Data":"a3a22fa53360b13bc01788cdbf51588c87b40635ddc9aef83cd69a1f7a32f02a"}
Mar 14 06:58:03 crc kubenswrapper[4713]: I0314 06:58:03.311043 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557858-qhtj8" podStartSLOduration=2.500461671 podStartE2EDuration="3.311025142s" podCreationTimestamp="2026-03-14 06:58:00 +0000 UTC" firstStartedPulling="2026-03-14 06:58:01.14915197 +0000 UTC m=+5464.237061270" lastFinishedPulling="2026-03-14 06:58:01.959715441 +0000 UTC m=+5465.047624741" observedRunningTime="2026-03-14 06:58:03.30365984 +0000 UTC m=+5466.391569140" watchObservedRunningTime="2026-03-14 06:58:03.311025142 +0000 UTC m=+5466.398934442"
Mar 14 06:58:03 crc kubenswrapper[4713]: I0314 06:58:03.563670 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:58:03 crc kubenswrapper[4713]: E0314 06:58:03.563996 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:58:04 crc kubenswrapper[4713]: I0314 06:58:04.301576 4713 generic.go:334] "Generic (PLEG): container finished" podID="9040d059-f826-4f51-a208-f291f5063f00" containerID="a3a22fa53360b13bc01788cdbf51588c87b40635ddc9aef83cd69a1f7a32f02a" exitCode=0
Mar 14 06:58:04 crc kubenswrapper[4713]: I0314 06:58:04.301681 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557858-qhtj8" event={"ID":"9040d059-f826-4f51-a208-f291f5063f00","Type":"ContainerDied","Data":"a3a22fa53360b13bc01788cdbf51588c87b40635ddc9aef83cd69a1f7a32f02a"}
Mar 14 06:58:05 crc kubenswrapper[4713]: I0314 06:58:05.765845 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557858-qhtj8"
Mar 14 06:58:05 crc kubenswrapper[4713]: I0314 06:58:05.901515 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbfzz\" (UniqueName: \"kubernetes.io/projected/9040d059-f826-4f51-a208-f291f5063f00-kube-api-access-zbfzz\") pod \"9040d059-f826-4f51-a208-f291f5063f00\" (UID: \"9040d059-f826-4f51-a208-f291f5063f00\") "
Mar 14 06:58:05 crc kubenswrapper[4713]: I0314 06:58:05.967853 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9040d059-f826-4f51-a208-f291f5063f00-kube-api-access-zbfzz" (OuterVolumeSpecName: "kube-api-access-zbfzz") pod "9040d059-f826-4f51-a208-f291f5063f00" (UID: "9040d059-f826-4f51-a208-f291f5063f00"). InnerVolumeSpecName "kube-api-access-zbfzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:58:06 crc kubenswrapper[4713]: I0314 06:58:06.006402 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbfzz\" (UniqueName: \"kubernetes.io/projected/9040d059-f826-4f51-a208-f291f5063f00-kube-api-access-zbfzz\") on node \"crc\" DevicePath \"\""
Mar 14 06:58:06 crc kubenswrapper[4713]: I0314 06:58:06.351252 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557858-qhtj8" event={"ID":"9040d059-f826-4f51-a208-f291f5063f00","Type":"ContainerDied","Data":"7be98fd6594a94137064166518a701d0f2ebb114cb995f53917a32816143f049"}
Mar 14 06:58:06 crc kubenswrapper[4713]: I0314 06:58:06.351487 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7be98fd6594a94137064166518a701d0f2ebb114cb995f53917a32816143f049"
Mar 14 06:58:06 crc kubenswrapper[4713]: I0314 06:58:06.351881 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557858-qhtj8"
Mar 14 06:58:06 crc kubenswrapper[4713]: I0314 06:58:06.379444 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557852-7mmfv"]
Mar 14 06:58:06 crc kubenswrapper[4713]: I0314 06:58:06.390669 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557852-7mmfv"]
Mar 14 06:58:07 crc kubenswrapper[4713]: I0314 06:58:07.584665 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97afee50-131f-4e19-a172-0020e3607abc" path="/var/lib/kubelet/pods/97afee50-131f-4e19-a172-0020e3607abc/volumes"
Mar 14 06:58:09 crc kubenswrapper[4713]: I0314 06:58:09.880051 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:58:09 crc kubenswrapper[4713]: I0314 06:58:09.984894 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:58:10 crc kubenswrapper[4713]: I0314 06:58:10.120178 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fdwr"]
Mar 14 06:58:11 crc kubenswrapper[4713]: I0314 06:58:11.397226 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5fdwr" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="registry-server" containerID="cri-o://96833d8769693dd706b424d5bc519df2328c4f50867c6c17d9588b970021af6c" gracePeriod=2
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.423652 4713 generic.go:334] "Generic (PLEG): container finished" podID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerID="96833d8769693dd706b424d5bc519df2328c4f50867c6c17d9588b970021af6c" exitCode=0
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.423728 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdwr" event={"ID":"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf","Type":"ContainerDied","Data":"96833d8769693dd706b424d5bc519df2328c4f50867c6c17d9588b970021af6c"}
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.606321 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.686559 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-utilities\") pod \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") "
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.686658 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9t4h\" (UniqueName: \"kubernetes.io/projected/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-kube-api-access-x9t4h\") pod \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") "
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.686926 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-catalog-content\") pod \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\" (UID: \"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf\") "
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.687981 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-utilities" (OuterVolumeSpecName: "utilities") pod "e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" (UID: "e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.706044 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-kube-api-access-x9t4h" (OuterVolumeSpecName: "kube-api-access-x9t4h") pod "e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" (UID: "e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf"). InnerVolumeSpecName "kube-api-access-x9t4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.801539 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.801579 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9t4h\" (UniqueName: \"kubernetes.io/projected/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-kube-api-access-x9t4h\") on node \"crc\" DevicePath \"\""
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.871530 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" (UID: "e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 06:58:12 crc kubenswrapper[4713]: I0314 06:58:12.904351 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 06:58:13 crc kubenswrapper[4713]: I0314 06:58:13.437652 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdwr" event={"ID":"e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf","Type":"ContainerDied","Data":"79bbd413a82610aaee61f2dfadaf7a040c358d6ae505523310e51152a17f3a4d"}
Mar 14 06:58:13 crc kubenswrapper[4713]: I0314 06:58:13.437705 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdwr"
Mar 14 06:58:13 crc kubenswrapper[4713]: I0314 06:58:13.437720 4713 scope.go:117] "RemoveContainer" containerID="96833d8769693dd706b424d5bc519df2328c4f50867c6c17d9588b970021af6c"
Mar 14 06:58:13 crc kubenswrapper[4713]: I0314 06:58:13.468677 4713 scope.go:117] "RemoveContainer" containerID="38fcda0c1b1a9e67b2917bdbf02c82624af795475dbb0d3b6e0c23c258cc4f8c"
Mar 14 06:58:13 crc kubenswrapper[4713]: I0314 06:58:13.476246 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fdwr"]
Mar 14 06:58:13 crc kubenswrapper[4713]: I0314 06:58:13.487497 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5fdwr"]
Mar 14 06:58:13 crc kubenswrapper[4713]: I0314 06:58:13.499983 4713 scope.go:117] "RemoveContainer" containerID="bf71d2552d71c07e1f181167d5f9204719ec7860a8ae443a07f446d89e25491f"
Mar 14 06:58:13 crc kubenswrapper[4713]: I0314 06:58:13.576830 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" path="/var/lib/kubelet/pods/e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf/volumes"
Mar 14 06:58:14 crc kubenswrapper[4713]: I0314 06:58:14.564096 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:58:14 crc kubenswrapper[4713]: E0314 06:58:14.564569 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:58:27 crc kubenswrapper[4713]: I0314 06:58:27.571645 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:58:27 crc kubenswrapper[4713]: E0314 06:58:27.572789 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:58:39 crc kubenswrapper[4713]: I0314 06:58:39.564108 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:58:39 crc kubenswrapper[4713]: E0314 06:58:39.565019 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:58:53 crc kubenswrapper[4713]: I0314 06:58:53.563814 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:58:53 crc kubenswrapper[4713]: E0314 06:58:53.564654 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:58:57 crc kubenswrapper[4713]: I0314 06:58:57.215457 4713 scope.go:117] "RemoveContainer" containerID="8beb7e79ad21b276b2dd0876c9ff84721fe2dc1996babc27eef7bc0dd5ed1f32"
Mar 14 06:59:08 crc kubenswrapper[4713]: I0314 06:59:08.565457 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:59:08 crc kubenswrapper[4713]: E0314 06:59:08.567263 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:59:20 crc kubenswrapper[4713]: I0314 06:59:20.564257 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:59:20 crc kubenswrapper[4713]: E0314 06:59:20.565315 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:59:35 crc kubenswrapper[4713]: I0314 06:59:35.563603 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:59:35 crc kubenswrapper[4713]: E0314 06:59:35.564502 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 06:59:46 crc kubenswrapper[4713]: I0314 06:59:46.564699 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4"
Mar 14 06:59:46 crc kubenswrapper[4713]: E0314 06:59:46.565604 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.146432 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557860-kg8kg"]
Mar 14 07:00:00 crc kubenswrapper[4713]: E0314 07:00:00.147592 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9040d059-f826-4f51-a208-f291f5063f00" containerName="oc"
Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.147608 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9040d059-f826-4f51-a208-f291f5063f00" containerName="oc" Mar 14 07:00:00 crc kubenswrapper[4713]: E0314 07:00:00.147631 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="extract-content" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.147639 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="extract-content" Mar 14 07:00:00 crc kubenswrapper[4713]: E0314 07:00:00.147651 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="extract-utilities" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.147658 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="extract-utilities" Mar 14 07:00:00 crc kubenswrapper[4713]: E0314 07:00:00.147681 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="registry-server" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.147686 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="registry-server" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.147921 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9040d059-f826-4f51-a208-f291f5063f00" containerName="oc" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.147933 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36c1f40-c63c-4e0e-8e30-40c8bfd10dbf" containerName="registry-server" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.148824 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557860-kg8kg" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.151141 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.151281 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.152124 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.158265 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557860-kg8kg"] Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.202538 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbt8j\" (UniqueName: \"kubernetes.io/projected/90261356-badd-47e2-9eb6-b3c4d0a32919-kube-api-access-vbt8j\") pod \"auto-csr-approver-29557860-kg8kg\" (UID: \"90261356-badd-47e2-9eb6-b3c4d0a32919\") " pod="openshift-infra/auto-csr-approver-29557860-kg8kg" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.244898 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb"] Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.246685 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.252871 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.253299 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.265984 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb"] Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.305054 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6abebe8-6848-4985-9f88-ab26e036aecc-secret-volume\") pod \"collect-profiles-29557860-h66hb\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.305130 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9cl\" (UniqueName: \"kubernetes.io/projected/a6abebe8-6848-4985-9f88-ab26e036aecc-kube-api-access-5j9cl\") pod \"collect-profiles-29557860-h66hb\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.305165 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6abebe8-6848-4985-9f88-ab26e036aecc-config-volume\") pod \"collect-profiles-29557860-h66hb\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.305299 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbt8j\" (UniqueName: \"kubernetes.io/projected/90261356-badd-47e2-9eb6-b3c4d0a32919-kube-api-access-vbt8j\") pod \"auto-csr-approver-29557860-kg8kg\" (UID: \"90261356-badd-47e2-9eb6-b3c4d0a32919\") " pod="openshift-infra/auto-csr-approver-29557860-kg8kg" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.324656 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbt8j\" (UniqueName: \"kubernetes.io/projected/90261356-badd-47e2-9eb6-b3c4d0a32919-kube-api-access-vbt8j\") pod \"auto-csr-approver-29557860-kg8kg\" (UID: \"90261356-badd-47e2-9eb6-b3c4d0a32919\") " pod="openshift-infra/auto-csr-approver-29557860-kg8kg" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.408384 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6abebe8-6848-4985-9f88-ab26e036aecc-secret-volume\") pod \"collect-profiles-29557860-h66hb\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.408820 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9cl\" (UniqueName: \"kubernetes.io/projected/a6abebe8-6848-4985-9f88-ab26e036aecc-kube-api-access-5j9cl\") pod \"collect-profiles-29557860-h66hb\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.408874 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a6abebe8-6848-4985-9f88-ab26e036aecc-config-volume\") pod \"collect-profiles-29557860-h66hb\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.410010 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6abebe8-6848-4985-9f88-ab26e036aecc-config-volume\") pod \"collect-profiles-29557860-h66hb\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.412475 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6abebe8-6848-4985-9f88-ab26e036aecc-secret-volume\") pod \"collect-profiles-29557860-h66hb\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.427270 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9cl\" (UniqueName: \"kubernetes.io/projected/a6abebe8-6848-4985-9f88-ab26e036aecc-kube-api-access-5j9cl\") pod \"collect-profiles-29557860-h66hb\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.469635 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557860-kg8kg" Mar 14 07:00:00 crc kubenswrapper[4713]: I0314 07:00:00.571131 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:01 crc kubenswrapper[4713]: I0314 07:00:01.605355 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4" Mar 14 07:00:01 crc kubenswrapper[4713]: E0314 07:00:01.605844 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:00:01 crc kubenswrapper[4713]: I0314 07:00:01.943704 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb"] Mar 14 07:00:01 crc kubenswrapper[4713]: W0314 07:00:01.955196 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6abebe8_6848_4985_9f88_ab26e036aecc.slice/crio-a66a729081e937fa3adab69d9517c776ae82cef7745fb9ab8fefe4caa00c6cf6 WatchSource:0}: Error finding container a66a729081e937fa3adab69d9517c776ae82cef7745fb9ab8fefe4caa00c6cf6: Status 404 returned error can't find the container with id a66a729081e937fa3adab69d9517c776ae82cef7745fb9ab8fefe4caa00c6cf6 Mar 14 07:00:01 crc kubenswrapper[4713]: I0314 07:00:01.967128 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557860-kg8kg"] Mar 14 07:00:02 crc kubenswrapper[4713]: I0314 07:00:02.680704 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" 
event={"ID":"a6abebe8-6848-4985-9f88-ab26e036aecc","Type":"ContainerStarted","Data":"292de4c56be5527615882cea39fa910dbf01f63a7eabdafef19dd4aeb2619b3e"} Mar 14 07:00:02 crc kubenswrapper[4713]: I0314 07:00:02.681015 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" event={"ID":"a6abebe8-6848-4985-9f88-ab26e036aecc","Type":"ContainerStarted","Data":"a66a729081e937fa3adab69d9517c776ae82cef7745fb9ab8fefe4caa00c6cf6"} Mar 14 07:00:02 crc kubenswrapper[4713]: I0314 07:00:02.682797 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557860-kg8kg" event={"ID":"90261356-badd-47e2-9eb6-b3c4d0a32919","Type":"ContainerStarted","Data":"c6c7adcfdb691d5b2b03f31b42d0e4cafbe87e775771a1d000251f2cebc03b61"} Mar 14 07:00:02 crc kubenswrapper[4713]: I0314 07:00:02.727140 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" podStartSLOduration=2.727100261 podStartE2EDuration="2.727100261s" podCreationTimestamp="2026-03-14 07:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:00:02.707465187 +0000 UTC m=+5585.795374487" watchObservedRunningTime="2026-03-14 07:00:02.727100261 +0000 UTC m=+5585.815009561" Mar 14 07:00:03 crc kubenswrapper[4713]: I0314 07:00:03.695486 4713 generic.go:334] "Generic (PLEG): container finished" podID="a6abebe8-6848-4985-9f88-ab26e036aecc" containerID="292de4c56be5527615882cea39fa910dbf01f63a7eabdafef19dd4aeb2619b3e" exitCode=0 Mar 14 07:00:03 crc kubenswrapper[4713]: I0314 07:00:03.695558 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" 
event={"ID":"a6abebe8-6848-4985-9f88-ab26e036aecc","Type":"ContainerDied","Data":"292de4c56be5527615882cea39fa910dbf01f63a7eabdafef19dd4aeb2619b3e"} Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.094292 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.203803 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6abebe8-6848-4985-9f88-ab26e036aecc-config-volume\") pod \"a6abebe8-6848-4985-9f88-ab26e036aecc\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.204018 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j9cl\" (UniqueName: \"kubernetes.io/projected/a6abebe8-6848-4985-9f88-ab26e036aecc-kube-api-access-5j9cl\") pod \"a6abebe8-6848-4985-9f88-ab26e036aecc\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.204176 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6abebe8-6848-4985-9f88-ab26e036aecc-secret-volume\") pod \"a6abebe8-6848-4985-9f88-ab26e036aecc\" (UID: \"a6abebe8-6848-4985-9f88-ab26e036aecc\") " Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.204646 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6abebe8-6848-4985-9f88-ab26e036aecc-config-volume" (OuterVolumeSpecName: "config-volume") pod "a6abebe8-6848-4985-9f88-ab26e036aecc" (UID: "a6abebe8-6848-4985-9f88-ab26e036aecc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.205496 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6abebe8-6848-4985-9f88-ab26e036aecc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.209836 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6abebe8-6848-4985-9f88-ab26e036aecc-kube-api-access-5j9cl" (OuterVolumeSpecName: "kube-api-access-5j9cl") pod "a6abebe8-6848-4985-9f88-ab26e036aecc" (UID: "a6abebe8-6848-4985-9f88-ab26e036aecc"). InnerVolumeSpecName "kube-api-access-5j9cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.210179 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6abebe8-6848-4985-9f88-ab26e036aecc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a6abebe8-6848-4985-9f88-ab26e036aecc" (UID: "a6abebe8-6848-4985-9f88-ab26e036aecc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.308350 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j9cl\" (UniqueName: \"kubernetes.io/projected/a6abebe8-6848-4985-9f88-ab26e036aecc-kube-api-access-5j9cl\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.308392 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6abebe8-6848-4985-9f88-ab26e036aecc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.721041 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" event={"ID":"a6abebe8-6848-4985-9f88-ab26e036aecc","Type":"ContainerDied","Data":"a66a729081e937fa3adab69d9517c776ae82cef7745fb9ab8fefe4caa00c6cf6"} Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.721088 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a66a729081e937fa3adab69d9517c776ae82cef7745fb9ab8fefe4caa00c6cf6" Mar 14 07:00:05 crc kubenswrapper[4713]: I0314 07:00:05.721124 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-h66hb" Mar 14 07:00:06 crc kubenswrapper[4713]: I0314 07:00:06.172981 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"] Mar 14 07:00:06 crc kubenswrapper[4713]: I0314 07:00:06.189082 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557815-p9p5h"] Mar 14 07:00:07 crc kubenswrapper[4713]: I0314 07:00:07.577595 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf42096c-26c2-4b01-9771-89caa06e1293" path="/var/lib/kubelet/pods/bf42096c-26c2-4b01-9771-89caa06e1293/volumes" Mar 14 07:00:16 crc kubenswrapper[4713]: I0314 07:00:16.564438 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4" Mar 14 07:00:16 crc kubenswrapper[4713]: E0314 07:00:16.565799 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:00:20 crc kubenswrapper[4713]: I0314 07:00:20.901420 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557860-kg8kg" event={"ID":"90261356-badd-47e2-9eb6-b3c4d0a32919","Type":"ContainerStarted","Data":"1ef12ee5236056489832626feccf550f542b7d7240818a29626417a2549550fa"} Mar 14 07:00:20 crc kubenswrapper[4713]: I0314 07:00:20.922967 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557860-kg8kg" podStartSLOduration=2.410643615 podStartE2EDuration="20.922945197s" 
podCreationTimestamp="2026-03-14 07:00:00 +0000 UTC" firstStartedPulling="2026-03-14 07:00:01.949649702 +0000 UTC m=+5585.037559002" lastFinishedPulling="2026-03-14 07:00:20.461951284 +0000 UTC m=+5603.549860584" observedRunningTime="2026-03-14 07:00:20.917177437 +0000 UTC m=+5604.005086757" watchObservedRunningTime="2026-03-14 07:00:20.922945197 +0000 UTC m=+5604.010854507" Mar 14 07:00:22 crc kubenswrapper[4713]: I0314 07:00:22.922289 4713 generic.go:334] "Generic (PLEG): container finished" podID="90261356-badd-47e2-9eb6-b3c4d0a32919" containerID="1ef12ee5236056489832626feccf550f542b7d7240818a29626417a2549550fa" exitCode=0 Mar 14 07:00:22 crc kubenswrapper[4713]: I0314 07:00:22.922377 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557860-kg8kg" event={"ID":"90261356-badd-47e2-9eb6-b3c4d0a32919","Type":"ContainerDied","Data":"1ef12ee5236056489832626feccf550f542b7d7240818a29626417a2549550fa"} Mar 14 07:00:24 crc kubenswrapper[4713]: I0314 07:00:24.324470 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557860-kg8kg" Mar 14 07:00:24 crc kubenswrapper[4713]: I0314 07:00:24.414883 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbt8j\" (UniqueName: \"kubernetes.io/projected/90261356-badd-47e2-9eb6-b3c4d0a32919-kube-api-access-vbt8j\") pod \"90261356-badd-47e2-9eb6-b3c4d0a32919\" (UID: \"90261356-badd-47e2-9eb6-b3c4d0a32919\") " Mar 14 07:00:24 crc kubenswrapper[4713]: I0314 07:00:24.423467 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90261356-badd-47e2-9eb6-b3c4d0a32919-kube-api-access-vbt8j" (OuterVolumeSpecName: "kube-api-access-vbt8j") pod "90261356-badd-47e2-9eb6-b3c4d0a32919" (UID: "90261356-badd-47e2-9eb6-b3c4d0a32919"). InnerVolumeSpecName "kube-api-access-vbt8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:24 crc kubenswrapper[4713]: I0314 07:00:24.518325 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbt8j\" (UniqueName: \"kubernetes.io/projected/90261356-badd-47e2-9eb6-b3c4d0a32919-kube-api-access-vbt8j\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:24 crc kubenswrapper[4713]: I0314 07:00:24.950568 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557860-kg8kg" event={"ID":"90261356-badd-47e2-9eb6-b3c4d0a32919","Type":"ContainerDied","Data":"c6c7adcfdb691d5b2b03f31b42d0e4cafbe87e775771a1d000251f2cebc03b61"} Mar 14 07:00:24 crc kubenswrapper[4713]: I0314 07:00:24.951066 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6c7adcfdb691d5b2b03f31b42d0e4cafbe87e775771a1d000251f2cebc03b61" Mar 14 07:00:24 crc kubenswrapper[4713]: I0314 07:00:24.950617 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557860-kg8kg" Mar 14 07:00:25 crc kubenswrapper[4713]: I0314 07:00:25.001878 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557854-5n4zr"] Mar 14 07:00:25 crc kubenswrapper[4713]: I0314 07:00:25.012324 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557854-5n4zr"] Mar 14 07:00:25 crc kubenswrapper[4713]: I0314 07:00:25.577519 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a4ad9d9-16f3-4eec-9db5-3338559c4bb0" path="/var/lib/kubelet/pods/6a4ad9d9-16f3-4eec-9db5-3338559c4bb0/volumes" Mar 14 07:00:30 crc kubenswrapper[4713]: I0314 07:00:30.565070 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4" Mar 14 07:00:30 crc kubenswrapper[4713]: E0314 07:00:30.566775 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:00:41 crc kubenswrapper[4713]: I0314 07:00:41.564154 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4" Mar 14 07:00:42 crc kubenswrapper[4713]: I0314 07:00:42.600289 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"79c0fd346a704a8b6683d885db93825cd1c533c532906608113e5943d1d5133e"} Mar 14 07:00:57 crc kubenswrapper[4713]: I0314 07:00:57.352266 4713 scope.go:117] "RemoveContainer" containerID="3806360501eea56d95b268cf9a52a33a41381b4f735e6c11a2cc23e2b4864145" Mar 14 07:00:57 crc kubenswrapper[4713]: I0314 07:00:57.647945 4713 scope.go:117] "RemoveContainer" containerID="78ab633a44b9185e7ad5ff00aab9466bd19f48c55744f7ac689a93c5062a3d99" Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.163064 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29557861-m89sp"] Mar 14 07:01:00 crc kubenswrapper[4713]: E0314 07:01:00.164372 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6abebe8-6848-4985-9f88-ab26e036aecc" containerName="collect-profiles" Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.164393 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6abebe8-6848-4985-9f88-ab26e036aecc" containerName="collect-profiles" Mar 14 07:01:00 crc kubenswrapper[4713]: E0314 07:01:00.164417 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90261356-badd-47e2-9eb6-b3c4d0a32919" containerName="oc" Mar 14 
07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.164425 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="90261356-badd-47e2-9eb6-b3c4d0a32919" containerName="oc"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.164763 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="90261356-badd-47e2-9eb6-b3c4d0a32919" containerName="oc"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.164809 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6abebe8-6848-4985-9f88-ab26e036aecc" containerName="collect-profiles"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.166160 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.176613 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557861-m89sp"]
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.349155 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-combined-ca-bundle\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.349329 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmgw\" (UniqueName: \"kubernetes.io/projected/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-kube-api-access-ccmgw\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.349416 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-config-data\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.349521 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-fernet-keys\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.453700 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-fernet-keys\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.453970 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-combined-ca-bundle\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.454142 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmgw\" (UniqueName: \"kubernetes.io/projected/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-kube-api-access-ccmgw\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.454319 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-config-data\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.463506 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-config-data\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.463947 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-fernet-keys\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.463960 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-combined-ca-bundle\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.481045 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmgw\" (UniqueName: \"kubernetes.io/projected/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-kube-api-access-ccmgw\") pod \"keystone-cron-29557861-m89sp\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") " pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:00 crc kubenswrapper[4713]: I0314 07:01:00.489316 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:01 crc kubenswrapper[4713]: I0314 07:01:01.010755 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557861-m89sp"]
Mar 14 07:01:01 crc kubenswrapper[4713]: I0314 07:01:01.835366 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557861-m89sp" event={"ID":"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5","Type":"ContainerStarted","Data":"d4efc0170056faa9dbe9501f925157cc8de94ae83c6dfbe789296fdcd351c961"}
Mar 14 07:01:01 crc kubenswrapper[4713]: I0314 07:01:01.837037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557861-m89sp" event={"ID":"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5","Type":"ContainerStarted","Data":"9795d98721fb33b889835f34762bfbb2d6c4b98f492cd010804d78c402e4ca07"}
Mar 14 07:01:01 crc kubenswrapper[4713]: I0314 07:01:01.861798 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29557861-m89sp" podStartSLOduration=1.8617751949999999 podStartE2EDuration="1.861775195s" podCreationTimestamp="2026-03-14 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:01.854433936 +0000 UTC m=+5644.942343226" watchObservedRunningTime="2026-03-14 07:01:01.861775195 +0000 UTC m=+5644.949684495"
Mar 14 07:01:06 crc kubenswrapper[4713]: I0314 07:01:06.181273 4713 generic.go:334] "Generic (PLEG): container finished" podID="dbd85a78-aca0-4dc4-ba3e-492b3bf749f5" containerID="d4efc0170056faa9dbe9501f925157cc8de94ae83c6dfbe789296fdcd351c961" exitCode=0
Mar 14 07:01:06 crc kubenswrapper[4713]: I0314 07:01:06.181357 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557861-m89sp" event={"ID":"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5","Type":"ContainerDied","Data":"d4efc0170056faa9dbe9501f925157cc8de94ae83c6dfbe789296fdcd351c961"}
Mar 14 07:01:07 crc kubenswrapper[4713]: I0314 07:01:07.726340 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:01:07 crc kubenswrapper[4713]: I0314 07:01:07.932538 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-combined-ca-bundle\") pod \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") "
Mar 14 07:01:07 crc kubenswrapper[4713]: I0314 07:01:07.933129 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccmgw\" (UniqueName: \"kubernetes.io/projected/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-kube-api-access-ccmgw\") pod \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") "
Mar 14 07:01:07 crc kubenswrapper[4713]: I0314 07:01:07.933315 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-config-data\") pod \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") "
Mar 14 07:01:07 crc kubenswrapper[4713]: I0314 07:01:07.933391 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-fernet-keys\") pod \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\" (UID: \"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5\") "
Mar 14 07:01:07 crc kubenswrapper[4713]: I0314 07:01:07.947050 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dbd85a78-aca0-4dc4-ba3e-492b3bf749f5" (UID: "dbd85a78-aca0-4dc4-ba3e-492b3bf749f5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:01:07 crc kubenswrapper[4713]: I0314 07:01:07.948262 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-kube-api-access-ccmgw" (OuterVolumeSpecName: "kube-api-access-ccmgw") pod "dbd85a78-aca0-4dc4-ba3e-492b3bf749f5" (UID: "dbd85a78-aca0-4dc4-ba3e-492b3bf749f5"). InnerVolumeSpecName "kube-api-access-ccmgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:01:07 crc kubenswrapper[4713]: I0314 07:01:07.976285 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbd85a78-aca0-4dc4-ba3e-492b3bf749f5" (UID: "dbd85a78-aca0-4dc4-ba3e-492b3bf749f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:01:08 crc kubenswrapper[4713]: I0314 07:01:08.001220 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-config-data" (OuterVolumeSpecName: "config-data") pod "dbd85a78-aca0-4dc4-ba3e-492b3bf749f5" (UID: "dbd85a78-aca0-4dc4-ba3e-492b3bf749f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:01:08 crc kubenswrapper[4713]: I0314 07:01:08.037933 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:01:08 crc kubenswrapper[4713]: I0314 07:01:08.037968 4713 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 14 07:01:08 crc kubenswrapper[4713]: I0314 07:01:08.037982 4713 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:01:08 crc kubenswrapper[4713]: I0314 07:01:08.037997 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccmgw\" (UniqueName: \"kubernetes.io/projected/dbd85a78-aca0-4dc4-ba3e-492b3bf749f5-kube-api-access-ccmgw\") on node \"crc\" DevicePath \"\""
Mar 14 07:01:08 crc kubenswrapper[4713]: I0314 07:01:08.212275 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557861-m89sp" event={"ID":"dbd85a78-aca0-4dc4-ba3e-492b3bf749f5","Type":"ContainerDied","Data":"9795d98721fb33b889835f34762bfbb2d6c4b98f492cd010804d78c402e4ca07"}
Mar 14 07:01:08 crc kubenswrapper[4713]: I0314 07:01:08.212322 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9795d98721fb33b889835f34762bfbb2d6c4b98f492cd010804d78c402e4ca07"
Mar 14 07:01:08 crc kubenswrapper[4713]: I0314 07:01:08.212381 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557861-m89sp"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.151662 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557862-7mnk4"]
Mar 14 07:02:00 crc kubenswrapper[4713]: E0314 07:02:00.153401 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd85a78-aca0-4dc4-ba3e-492b3bf749f5" containerName="keystone-cron"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.153419 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd85a78-aca0-4dc4-ba3e-492b3bf749f5" containerName="keystone-cron"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.153729 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd85a78-aca0-4dc4-ba3e-492b3bf749f5" containerName="keystone-cron"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.154621 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-7mnk4"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.157839 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.158366 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.160409 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.164680 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-7mnk4"]
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.266278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trl99\" (UniqueName: \"kubernetes.io/projected/974ecbeb-4410-4c5a-aa81-ea8f55933004-kube-api-access-trl99\") pod \"auto-csr-approver-29557862-7mnk4\" (UID: \"974ecbeb-4410-4c5a-aa81-ea8f55933004\") " pod="openshift-infra/auto-csr-approver-29557862-7mnk4"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.368790 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trl99\" (UniqueName: \"kubernetes.io/projected/974ecbeb-4410-4c5a-aa81-ea8f55933004-kube-api-access-trl99\") pod \"auto-csr-approver-29557862-7mnk4\" (UID: \"974ecbeb-4410-4c5a-aa81-ea8f55933004\") " pod="openshift-infra/auto-csr-approver-29557862-7mnk4"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.408274 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trl99\" (UniqueName: \"kubernetes.io/projected/974ecbeb-4410-4c5a-aa81-ea8f55933004-kube-api-access-trl99\") pod \"auto-csr-approver-29557862-7mnk4\" (UID: \"974ecbeb-4410-4c5a-aa81-ea8f55933004\") " pod="openshift-infra/auto-csr-approver-29557862-7mnk4"
Mar 14 07:02:00 crc kubenswrapper[4713]: I0314 07:02:00.484571 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-7mnk4"
Mar 14 07:02:01 crc kubenswrapper[4713]: I0314 07:02:01.036084 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-7mnk4"]
Mar 14 07:02:01 crc kubenswrapper[4713]: I0314 07:02:01.806712 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-7mnk4" event={"ID":"974ecbeb-4410-4c5a-aa81-ea8f55933004","Type":"ContainerStarted","Data":"63681280b4be845a0f2efa34054233a2c3d83694057064764b7f423439f87191"}
Mar 14 07:02:02 crc kubenswrapper[4713]: I0314 07:02:02.816600 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-7mnk4" event={"ID":"974ecbeb-4410-4c5a-aa81-ea8f55933004","Type":"ContainerStarted","Data":"a6619d0a287160df5d4d84bedec684f2a9348042bc0ac656f704614b19ef8f06"}
Mar 14 07:02:02 crc kubenswrapper[4713]: I0314 07:02:02.830928 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557862-7mnk4" podStartSLOduration=1.933623555 podStartE2EDuration="2.830898577s" podCreationTimestamp="2026-03-14 07:02:00 +0000 UTC" firstStartedPulling="2026-03-14 07:02:01.038604354 +0000 UTC m=+5704.126513654" lastFinishedPulling="2026-03-14 07:02:01.935879386 +0000 UTC m=+5705.023788676" observedRunningTime="2026-03-14 07:02:02.829780713 +0000 UTC m=+5705.917690003" watchObservedRunningTime="2026-03-14 07:02:02.830898577 +0000 UTC m=+5705.918807907"
Mar 14 07:02:03 crc kubenswrapper[4713]: I0314 07:02:03.830462 4713 generic.go:334] "Generic (PLEG): container finished" podID="974ecbeb-4410-4c5a-aa81-ea8f55933004" containerID="a6619d0a287160df5d4d84bedec684f2a9348042bc0ac656f704614b19ef8f06" exitCode=0
Mar 14 07:02:03 crc kubenswrapper[4713]: I0314 07:02:03.830522 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-7mnk4" event={"ID":"974ecbeb-4410-4c5a-aa81-ea8f55933004","Type":"ContainerDied","Data":"a6619d0a287160df5d4d84bedec684f2a9348042bc0ac656f704614b19ef8f06"}
Mar 14 07:02:05 crc kubenswrapper[4713]: I0314 07:02:05.301653 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-7mnk4"
Mar 14 07:02:05 crc kubenswrapper[4713]: I0314 07:02:05.398625 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trl99\" (UniqueName: \"kubernetes.io/projected/974ecbeb-4410-4c5a-aa81-ea8f55933004-kube-api-access-trl99\") pod \"974ecbeb-4410-4c5a-aa81-ea8f55933004\" (UID: \"974ecbeb-4410-4c5a-aa81-ea8f55933004\") "
Mar 14 07:02:05 crc kubenswrapper[4713]: I0314 07:02:05.405351 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974ecbeb-4410-4c5a-aa81-ea8f55933004-kube-api-access-trl99" (OuterVolumeSpecName: "kube-api-access-trl99") pod "974ecbeb-4410-4c5a-aa81-ea8f55933004" (UID: "974ecbeb-4410-4c5a-aa81-ea8f55933004"). InnerVolumeSpecName "kube-api-access-trl99". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:02:05 crc kubenswrapper[4713]: I0314 07:02:05.502251 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trl99\" (UniqueName: \"kubernetes.io/projected/974ecbeb-4410-4c5a-aa81-ea8f55933004-kube-api-access-trl99\") on node \"crc\" DevicePath \"\""
Mar 14 07:02:05 crc kubenswrapper[4713]: I0314 07:02:05.862246 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-7mnk4" event={"ID":"974ecbeb-4410-4c5a-aa81-ea8f55933004","Type":"ContainerDied","Data":"63681280b4be845a0f2efa34054233a2c3d83694057064764b7f423439f87191"}
Mar 14 07:02:05 crc kubenswrapper[4713]: I0314 07:02:05.862555 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63681280b4be845a0f2efa34054233a2c3d83694057064764b7f423439f87191"
Mar 14 07:02:05 crc kubenswrapper[4713]: I0314 07:02:05.862276 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-7mnk4"
Mar 14 07:02:05 crc kubenswrapper[4713]: I0314 07:02:05.909077 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557856-kx78m"]
Mar 14 07:02:05 crc kubenswrapper[4713]: I0314 07:02:05.922223 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557856-kx78m"]
Mar 14 07:02:07 crc kubenswrapper[4713]: I0314 07:02:07.577894 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27956939-a5c2-4749-a280-b1ed90490162" path="/var/lib/kubelet/pods/27956939-a5c2-4749-a280-b1ed90490162/volumes"
Mar 14 07:02:57 crc kubenswrapper[4713]: I0314 07:02:57.837701 4713 scope.go:117] "RemoveContainer" containerID="79f339aea60b4b139cd236346c0f132c6acf4b515103ce663dd6dd89668bcfb2"
Mar 14 07:03:10 crc kubenswrapper[4713]: I0314 07:03:10.731478 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 07:03:10 crc kubenswrapper[4713]: I0314 07:03:10.732265 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.705701 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rn6jj"]
Mar 14 07:03:19 crc kubenswrapper[4713]: E0314 07:03:19.707033 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974ecbeb-4410-4c5a-aa81-ea8f55933004" containerName="oc"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.707055 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="974ecbeb-4410-4c5a-aa81-ea8f55933004" containerName="oc"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.707405 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="974ecbeb-4410-4c5a-aa81-ea8f55933004" containerName="oc"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.709527 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.726052 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rn6jj"]
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.771309 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjsrt\" (UniqueName: \"kubernetes.io/projected/55165af7-0d43-400a-86e5-76ef5a527cb6-kube-api-access-xjsrt\") pod \"certified-operators-rn6jj\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") " pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.771834 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-utilities\") pod \"certified-operators-rn6jj\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") " pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.771959 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-catalog-content\") pod \"certified-operators-rn6jj\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") " pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.874423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjsrt\" (UniqueName: \"kubernetes.io/projected/55165af7-0d43-400a-86e5-76ef5a527cb6-kube-api-access-xjsrt\") pod \"certified-operators-rn6jj\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") " pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.874617 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-utilities\") pod \"certified-operators-rn6jj\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") " pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.874697 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-catalog-content\") pod \"certified-operators-rn6jj\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") " pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.875457 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-catalog-content\") pod \"certified-operators-rn6jj\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") " pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.875468 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-utilities\") pod \"certified-operators-rn6jj\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") " pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:19 crc kubenswrapper[4713]: I0314 07:03:19.897795 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjsrt\" (UniqueName: \"kubernetes.io/projected/55165af7-0d43-400a-86e5-76ef5a527cb6-kube-api-access-xjsrt\") pod \"certified-operators-rn6jj\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") " pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:20 crc kubenswrapper[4713]: I0314 07:03:20.048197 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:20 crc kubenswrapper[4713]: I0314 07:03:20.731999 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rn6jj"]
Mar 14 07:03:20 crc kubenswrapper[4713]: I0314 07:03:20.935558 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn6jj" event={"ID":"55165af7-0d43-400a-86e5-76ef5a527cb6","Type":"ContainerStarted","Data":"7653d0d913c519254c8fc1390a3f41a8232daa0e13f5459dc528cb3f7439c628"}
Mar 14 07:03:21 crc kubenswrapper[4713]: I0314 07:03:21.946158 4713 generic.go:334] "Generic (PLEG): container finished" podID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerID="5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f" exitCode=0
Mar 14 07:03:21 crc kubenswrapper[4713]: I0314 07:03:21.946221 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn6jj" event={"ID":"55165af7-0d43-400a-86e5-76ef5a527cb6","Type":"ContainerDied","Data":"5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f"}
Mar 14 07:03:21 crc kubenswrapper[4713]: I0314 07:03:21.948770 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 07:03:22 crc kubenswrapper[4713]: I0314 07:03:22.960423 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn6jj" event={"ID":"55165af7-0d43-400a-86e5-76ef5a527cb6","Type":"ContainerStarted","Data":"19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49"}
Mar 14 07:03:24 crc kubenswrapper[4713]: I0314 07:03:24.986849 4713 generic.go:334] "Generic (PLEG): container finished" podID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerID="19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49" exitCode=0
Mar 14 07:03:24 crc kubenswrapper[4713]: I0314 07:03:24.986911 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn6jj" event={"ID":"55165af7-0d43-400a-86e5-76ef5a527cb6","Type":"ContainerDied","Data":"19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49"}
Mar 14 07:03:26 crc kubenswrapper[4713]: I0314 07:03:26.003239 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn6jj" event={"ID":"55165af7-0d43-400a-86e5-76ef5a527cb6","Type":"ContainerStarted","Data":"fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf"}
Mar 14 07:03:26 crc kubenswrapper[4713]: I0314 07:03:26.036610 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rn6jj" podStartSLOduration=3.303741053 podStartE2EDuration="7.036588652s" podCreationTimestamp="2026-03-14 07:03:19 +0000 UTC" firstStartedPulling="2026-03-14 07:03:21.948529316 +0000 UTC m=+5785.036438616" lastFinishedPulling="2026-03-14 07:03:25.681376925 +0000 UTC m=+5788.769286215" observedRunningTime="2026-03-14 07:03:26.026849168 +0000 UTC m=+5789.114758468" watchObservedRunningTime="2026-03-14 07:03:26.036588652 +0000 UTC m=+5789.124497952"
Mar 14 07:03:30 crc kubenswrapper[4713]: I0314 07:03:30.050025 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:30 crc kubenswrapper[4713]: I0314 07:03:30.050517 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:31 crc kubenswrapper[4713]: I0314 07:03:31.661540 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rn6jj" podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerName="registry-server" probeResult="failure" output=<
Mar 14 07:03:31 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 07:03:31 crc kubenswrapper[4713]: >
Mar 14 07:03:40 crc kubenswrapper[4713]: I0314 07:03:40.171417 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:40 crc kubenswrapper[4713]: I0314 07:03:40.234931 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:40 crc kubenswrapper[4713]: I0314 07:03:40.414472 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rn6jj"]
Mar 14 07:03:40 crc kubenswrapper[4713]: I0314 07:03:40.732437 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 07:03:40 crc kubenswrapper[4713]: I0314 07:03:40.732524 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 07:03:41 crc kubenswrapper[4713]: I0314 07:03:41.212261 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rn6jj" podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerName="registry-server" containerID="cri-o://fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf" gracePeriod=2
Mar 14 07:03:41 crc kubenswrapper[4713]: I0314 07:03:41.815156 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:41 crc kubenswrapper[4713]: I0314 07:03:41.902305 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-catalog-content\") pod \"55165af7-0d43-400a-86e5-76ef5a527cb6\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") "
Mar 14 07:03:41 crc kubenswrapper[4713]: I0314 07:03:41.902677 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-utilities\") pod \"55165af7-0d43-400a-86e5-76ef5a527cb6\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") "
Mar 14 07:03:41 crc kubenswrapper[4713]: I0314 07:03:41.902754 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjsrt\" (UniqueName: \"kubernetes.io/projected/55165af7-0d43-400a-86e5-76ef5a527cb6-kube-api-access-xjsrt\") pod \"55165af7-0d43-400a-86e5-76ef5a527cb6\" (UID: \"55165af7-0d43-400a-86e5-76ef5a527cb6\") "
Mar 14 07:03:41 crc kubenswrapper[4713]: I0314 07:03:41.903560 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-utilities" (OuterVolumeSpecName: "utilities") pod "55165af7-0d43-400a-86e5-76ef5a527cb6" (UID: "55165af7-0d43-400a-86e5-76ef5a527cb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:03:41 crc kubenswrapper[4713]: I0314 07:03:41.908758 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55165af7-0d43-400a-86e5-76ef5a527cb6-kube-api-access-xjsrt" (OuterVolumeSpecName: "kube-api-access-xjsrt") pod "55165af7-0d43-400a-86e5-76ef5a527cb6" (UID: "55165af7-0d43-400a-86e5-76ef5a527cb6"). InnerVolumeSpecName "kube-api-access-xjsrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.006705 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.006741 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjsrt\" (UniqueName: \"kubernetes.io/projected/55165af7-0d43-400a-86e5-76ef5a527cb6-kube-api-access-xjsrt\") on node \"crc\" DevicePath \"\""
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.049793 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55165af7-0d43-400a-86e5-76ef5a527cb6" (UID: "55165af7-0d43-400a-86e5-76ef5a527cb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.109000 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55165af7-0d43-400a-86e5-76ef5a527cb6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.256221 4713 generic.go:334] "Generic (PLEG): container finished" podID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerID="fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf" exitCode=0
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.256283 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn6jj" event={"ID":"55165af7-0d43-400a-86e5-76ef5a527cb6","Type":"ContainerDied","Data":"fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf"}
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.256315 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn6jj" event={"ID":"55165af7-0d43-400a-86e5-76ef5a527cb6","Type":"ContainerDied","Data":"7653d0d913c519254c8fc1390a3f41a8232daa0e13f5459dc528cb3f7439c628"}
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.256335 4713 scope.go:117] "RemoveContainer" containerID="fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf"
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.256531 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rn6jj"
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.284420 4713 scope.go:117] "RemoveContainer" containerID="19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49"
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.305807 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rn6jj"]
Mar 14 07:03:42 crc kubenswrapper[4713]: I0314 07:03:42.318348 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rn6jj"]
Mar 14 07:03:43 crc kubenswrapper[4713]: I0314 07:03:43.121432 4713 scope.go:117] "RemoveContainer" containerID="5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f"
Mar 14 07:03:43 crc kubenswrapper[4713]: I0314 07:03:43.222581 4713 scope.go:117] "RemoveContainer" containerID="fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf"
Mar 14 07:03:43 crc kubenswrapper[4713]: E0314 07:03:43.237683 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf\": container with ID starting with fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf not found: ID does not exist" containerID="fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf"
Mar 14 07:03:43 crc kubenswrapper[4713]: I0314 07:03:43.237767 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf"} err="failed to get container status \"fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf\": rpc error: code = NotFound desc = could not find container \"fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf\": container with ID starting with fd8f32634e09fcd91453efdc546b321b8b0a592d5b9df31c71e8681f05e791bf not found: ID does not exist"
Mar 14 07:03:43 crc kubenswrapper[4713]: I0314 07:03:43.237796 4713 scope.go:117] "RemoveContainer" containerID="19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49"
Mar 14 07:03:43 crc kubenswrapper[4713]: E0314 07:03:43.239674 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49\": container with ID starting with 19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49 not found: ID does not exist" containerID="19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49"
Mar 14 07:03:43 crc kubenswrapper[4713]: I0314 07:03:43.239718 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49"} err="failed to get container status \"19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49\": rpc error: code = NotFound desc = could not find container \"19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49\": container with ID starting with 19e88ba9632dea0f9ea2040af3fc24914d37eb518bb13c9dbc19ab53700dea49 not found: ID does not exist"
Mar 14 07:03:43 crc kubenswrapper[4713]: I0314 07:03:43.239737 4713 scope.go:117] "RemoveContainer" containerID="5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f"
Mar 14 07:03:43 crc
kubenswrapper[4713]: E0314 07:03:43.240096 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f\": container with ID starting with 5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f not found: ID does not exist" containerID="5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f" Mar 14 07:03:43 crc kubenswrapper[4713]: I0314 07:03:43.240139 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f"} err="failed to get container status \"5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f\": rpc error: code = NotFound desc = could not find container \"5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f\": container with ID starting with 5c58c2e7077876f1e521922b7e87cd4cb7029c85018bd0a63ebdc31eebddc91f not found: ID does not exist" Mar 14 07:03:43 crc kubenswrapper[4713]: I0314 07:03:43.577107 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" path="/var/lib/kubelet/pods/55165af7-0d43-400a-86e5-76ef5a527cb6/volumes" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.149327 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557864-qpsjp"] Mar 14 07:04:00 crc kubenswrapper[4713]: E0314 07:04:00.150490 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerName="extract-content" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.150511 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerName="extract-content" Mar 14 07:04:00 crc kubenswrapper[4713]: E0314 07:04:00.150550 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerName="registry-server" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.150560 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerName="registry-server" Mar 14 07:04:00 crc kubenswrapper[4713]: E0314 07:04:00.150603 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerName="extract-utilities" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.150611 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerName="extract-utilities" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.150874 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="55165af7-0d43-400a-86e5-76ef5a527cb6" containerName="registry-server" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.152042 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-qpsjp" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.154961 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.156512 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.156767 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.184416 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-qpsjp"] Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.269004 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrqbx\" (UniqueName: 
\"kubernetes.io/projected/d84c273f-a96b-49e0-aa6f-8c1808965f34-kube-api-access-lrqbx\") pod \"auto-csr-approver-29557864-qpsjp\" (UID: \"d84c273f-a96b-49e0-aa6f-8c1808965f34\") " pod="openshift-infra/auto-csr-approver-29557864-qpsjp" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.372030 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrqbx\" (UniqueName: \"kubernetes.io/projected/d84c273f-a96b-49e0-aa6f-8c1808965f34-kube-api-access-lrqbx\") pod \"auto-csr-approver-29557864-qpsjp\" (UID: \"d84c273f-a96b-49e0-aa6f-8c1808965f34\") " pod="openshift-infra/auto-csr-approver-29557864-qpsjp" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.391954 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrqbx\" (UniqueName: \"kubernetes.io/projected/d84c273f-a96b-49e0-aa6f-8c1808965f34-kube-api-access-lrqbx\") pod \"auto-csr-approver-29557864-qpsjp\" (UID: \"d84c273f-a96b-49e0-aa6f-8c1808965f34\") " pod="openshift-infra/auto-csr-approver-29557864-qpsjp" Mar 14 07:04:00 crc kubenswrapper[4713]: I0314 07:04:00.490090 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-qpsjp" Mar 14 07:04:01 crc kubenswrapper[4713]: I0314 07:04:01.018121 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-qpsjp"] Mar 14 07:04:01 crc kubenswrapper[4713]: I0314 07:04:01.464536 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-qpsjp" event={"ID":"d84c273f-a96b-49e0-aa6f-8c1808965f34","Type":"ContainerStarted","Data":"fc8e3d38d4d544b2bda054a745bb19bbeff01045da5d18a851b881615e31ee67"} Mar 14 07:04:02 crc kubenswrapper[4713]: I0314 07:04:02.479620 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-qpsjp" event={"ID":"d84c273f-a96b-49e0-aa6f-8c1808965f34","Type":"ContainerStarted","Data":"36c4f910c4e3a9c6398f0f29d15f2ffd8717565ce2eca9c78ae30e7881f7afa4"} Mar 14 07:04:02 crc kubenswrapper[4713]: I0314 07:04:02.509709 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557864-qpsjp" podStartSLOduration=1.554547483 podStartE2EDuration="2.509687233s" podCreationTimestamp="2026-03-14 07:04:00 +0000 UTC" firstStartedPulling="2026-03-14 07:04:01.02308173 +0000 UTC m=+5824.110991030" lastFinishedPulling="2026-03-14 07:04:01.97822147 +0000 UTC m=+5825.066130780" observedRunningTime="2026-03-14 07:04:02.496616745 +0000 UTC m=+5825.584526045" watchObservedRunningTime="2026-03-14 07:04:02.509687233 +0000 UTC m=+5825.597596533" Mar 14 07:04:03 crc kubenswrapper[4713]: I0314 07:04:03.503697 4713 generic.go:334] "Generic (PLEG): container finished" podID="d84c273f-a96b-49e0-aa6f-8c1808965f34" containerID="36c4f910c4e3a9c6398f0f29d15f2ffd8717565ce2eca9c78ae30e7881f7afa4" exitCode=0 Mar 14 07:04:03 crc kubenswrapper[4713]: I0314 07:04:03.503760 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-qpsjp" 
event={"ID":"d84c273f-a96b-49e0-aa6f-8c1808965f34","Type":"ContainerDied","Data":"36c4f910c4e3a9c6398f0f29d15f2ffd8717565ce2eca9c78ae30e7881f7afa4"} Mar 14 07:04:04 crc kubenswrapper[4713]: I0314 07:04:04.971421 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-qpsjp" Mar 14 07:04:05 crc kubenswrapper[4713]: I0314 07:04:05.085825 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrqbx\" (UniqueName: \"kubernetes.io/projected/d84c273f-a96b-49e0-aa6f-8c1808965f34-kube-api-access-lrqbx\") pod \"d84c273f-a96b-49e0-aa6f-8c1808965f34\" (UID: \"d84c273f-a96b-49e0-aa6f-8c1808965f34\") " Mar 14 07:04:05 crc kubenswrapper[4713]: I0314 07:04:05.106573 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d84c273f-a96b-49e0-aa6f-8c1808965f34-kube-api-access-lrqbx" (OuterVolumeSpecName: "kube-api-access-lrqbx") pod "d84c273f-a96b-49e0-aa6f-8c1808965f34" (UID: "d84c273f-a96b-49e0-aa6f-8c1808965f34"). InnerVolumeSpecName "kube-api-access-lrqbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:05 crc kubenswrapper[4713]: I0314 07:04:05.188760 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrqbx\" (UniqueName: \"kubernetes.io/projected/d84c273f-a96b-49e0-aa6f-8c1808965f34-kube-api-access-lrqbx\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:05 crc kubenswrapper[4713]: I0314 07:04:05.543933 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-qpsjp" event={"ID":"d84c273f-a96b-49e0-aa6f-8c1808965f34","Type":"ContainerDied","Data":"fc8e3d38d4d544b2bda054a745bb19bbeff01045da5d18a851b881615e31ee67"} Mar 14 07:04:05 crc kubenswrapper[4713]: I0314 07:04:05.543975 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc8e3d38d4d544b2bda054a745bb19bbeff01045da5d18a851b881615e31ee67" Mar 14 07:04:05 crc kubenswrapper[4713]: I0314 07:04:05.544030 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-qpsjp" Mar 14 07:04:05 crc kubenswrapper[4713]: I0314 07:04:05.596547 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557858-qhtj8"] Mar 14 07:04:05 crc kubenswrapper[4713]: I0314 07:04:05.607834 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557858-qhtj8"] Mar 14 07:04:07 crc kubenswrapper[4713]: I0314 07:04:07.610496 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9040d059-f826-4f51-a208-f291f5063f00" path="/var/lib/kubelet/pods/9040d059-f826-4f51-a208-f291f5063f00/volumes" Mar 14 07:04:10 crc kubenswrapper[4713]: I0314 07:04:10.731342 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 14 07:04:10 crc kubenswrapper[4713]: I0314 07:04:10.731608 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:04:10 crc kubenswrapper[4713]: I0314 07:04:10.731651 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 07:04:10 crc kubenswrapper[4713]: I0314 07:04:10.733792 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79c0fd346a704a8b6683d885db93825cd1c533c532906608113e5943d1d5133e"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:04:10 crc kubenswrapper[4713]: I0314 07:04:10.733895 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://79c0fd346a704a8b6683d885db93825cd1c533c532906608113e5943d1d5133e" gracePeriod=600 Mar 14 07:04:11 crc kubenswrapper[4713]: I0314 07:04:11.635707 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="79c0fd346a704a8b6683d885db93825cd1c533c532906608113e5943d1d5133e" exitCode=0 Mar 14 07:04:11 crc kubenswrapper[4713]: I0314 07:04:11.635777 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" 
event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"79c0fd346a704a8b6683d885db93825cd1c533c532906608113e5943d1d5133e"} Mar 14 07:04:11 crc kubenswrapper[4713]: I0314 07:04:11.636347 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"} Mar 14 07:04:11 crc kubenswrapper[4713]: I0314 07:04:11.636372 4713 scope.go:117] "RemoveContainer" containerID="771d18ca5f39e3282df6cc8a6c04fe69dd302b4ef72c56cdaae2f952b869aef4" Mar 14 07:04:15 crc kubenswrapper[4713]: I0314 07:04:15.893341 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7m9q2"] Mar 14 07:04:15 crc kubenswrapper[4713]: E0314 07:04:15.894985 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84c273f-a96b-49e0-aa6f-8c1808965f34" containerName="oc" Mar 14 07:04:15 crc kubenswrapper[4713]: I0314 07:04:15.895004 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84c273f-a96b-49e0-aa6f-8c1808965f34" containerName="oc" Mar 14 07:04:15 crc kubenswrapper[4713]: I0314 07:04:15.895642 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d84c273f-a96b-49e0-aa6f-8c1808965f34" containerName="oc" Mar 14 07:04:15 crc kubenswrapper[4713]: I0314 07:04:15.899705 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:15 crc kubenswrapper[4713]: I0314 07:04:15.925384 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7m9q2"] Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.011622 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-utilities\") pod \"community-operators-7m9q2\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.012237 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-catalog-content\") pod \"community-operators-7m9q2\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.012307 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4d2z\" (UniqueName: \"kubernetes.io/projected/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-kube-api-access-p4d2z\") pod \"community-operators-7m9q2\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.115384 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-utilities\") pod \"community-operators-7m9q2\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.115468 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-catalog-content\") pod \"community-operators-7m9q2\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.115526 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4d2z\" (UniqueName: \"kubernetes.io/projected/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-kube-api-access-p4d2z\") pod \"community-operators-7m9q2\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.115886 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-utilities\") pod \"community-operators-7m9q2\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.116188 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-catalog-content\") pod \"community-operators-7m9q2\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.136310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4d2z\" (UniqueName: \"kubernetes.io/projected/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-kube-api-access-p4d2z\") pod \"community-operators-7m9q2\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.234069 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:16 crc kubenswrapper[4713]: I0314 07:04:16.944850 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7m9q2"] Mar 14 07:04:17 crc kubenswrapper[4713]: I0314 07:04:17.725041 4713 generic.go:334] "Generic (PLEG): container finished" podID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerID="fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de" exitCode=0 Mar 14 07:04:17 crc kubenswrapper[4713]: I0314 07:04:17.725237 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9q2" event={"ID":"c90f32b7-30fc-4e07-b5e9-d03a98a981d5","Type":"ContainerDied","Data":"fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de"} Mar 14 07:04:17 crc kubenswrapper[4713]: I0314 07:04:17.725363 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9q2" event={"ID":"c90f32b7-30fc-4e07-b5e9-d03a98a981d5","Type":"ContainerStarted","Data":"9a0c3c6495d5e5b8729b1a996a1a45c88a03521ae96cb0365bc09c7e087f356d"} Mar 14 07:04:18 crc kubenswrapper[4713]: I0314 07:04:18.741107 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9q2" event={"ID":"c90f32b7-30fc-4e07-b5e9-d03a98a981d5","Type":"ContainerStarted","Data":"514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579"} Mar 14 07:04:22 crc kubenswrapper[4713]: E0314 07:04:22.429218 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc90f32b7_30fc_4e07_b5e9_d03a98a981d5.slice/crio-conmon-514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579.scope\": RecentStats: unable to find data in memory cache]" Mar 14 07:04:22 crc kubenswrapper[4713]: I0314 07:04:22.786905 4713 generic.go:334] "Generic (PLEG): 
container finished" podID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerID="514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579" exitCode=0 Mar 14 07:04:22 crc kubenswrapper[4713]: I0314 07:04:22.786986 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9q2" event={"ID":"c90f32b7-30fc-4e07-b5e9-d03a98a981d5","Type":"ContainerDied","Data":"514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579"} Mar 14 07:04:25 crc kubenswrapper[4713]: I0314 07:04:25.821044 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9q2" event={"ID":"c90f32b7-30fc-4e07-b5e9-d03a98a981d5","Type":"ContainerStarted","Data":"d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269"} Mar 14 07:04:25 crc kubenswrapper[4713]: I0314 07:04:25.850652 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7m9q2" podStartSLOduration=4.10422334 podStartE2EDuration="10.850631107s" podCreationTimestamp="2026-03-14 07:04:15 +0000 UTC" firstStartedPulling="2026-03-14 07:04:17.727308954 +0000 UTC m=+5840.815218254" lastFinishedPulling="2026-03-14 07:04:24.473716721 +0000 UTC m=+5847.561626021" observedRunningTime="2026-03-14 07:04:25.845480617 +0000 UTC m=+5848.933389917" watchObservedRunningTime="2026-03-14 07:04:25.850631107 +0000 UTC m=+5848.938540407" Mar 14 07:04:26 crc kubenswrapper[4713]: I0314 07:04:26.235706 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:26 crc kubenswrapper[4713]: I0314 07:04:26.237532 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:27 crc kubenswrapper[4713]: I0314 07:04:27.290347 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7m9q2" 
podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerName="registry-server" probeResult="failure" output=< Mar 14 07:04:27 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:04:27 crc kubenswrapper[4713]: > Mar 14 07:04:36 crc kubenswrapper[4713]: I0314 07:04:36.298602 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:36 crc kubenswrapper[4713]: I0314 07:04:36.359092 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:36 crc kubenswrapper[4713]: I0314 07:04:36.537688 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7m9q2"] Mar 14 07:04:37 crc kubenswrapper[4713]: I0314 07:04:37.944100 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7m9q2" podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerName="registry-server" containerID="cri-o://d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269" gracePeriod=2 Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.532958 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.699407 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-utilities\") pod \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.699637 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-catalog-content\") pod \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.699744 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4d2z\" (UniqueName: \"kubernetes.io/projected/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-kube-api-access-p4d2z\") pod \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\" (UID: \"c90f32b7-30fc-4e07-b5e9-d03a98a981d5\") " Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.700265 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-utilities" (OuterVolumeSpecName: "utilities") pod "c90f32b7-30fc-4e07-b5e9-d03a98a981d5" (UID: "c90f32b7-30fc-4e07-b5e9-d03a98a981d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.701987 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.705777 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-kube-api-access-p4d2z" (OuterVolumeSpecName: "kube-api-access-p4d2z") pod "c90f32b7-30fc-4e07-b5e9-d03a98a981d5" (UID: "c90f32b7-30fc-4e07-b5e9-d03a98a981d5"). InnerVolumeSpecName "kube-api-access-p4d2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.758726 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c90f32b7-30fc-4e07-b5e9-d03a98a981d5" (UID: "c90f32b7-30fc-4e07-b5e9-d03a98a981d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.804342 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.804378 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4d2z\" (UniqueName: \"kubernetes.io/projected/c90f32b7-30fc-4e07-b5e9-d03a98a981d5-kube-api-access-p4d2z\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.954312 4713 generic.go:334] "Generic (PLEG): container finished" podID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerID="d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269" exitCode=0 Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.954350 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9q2" event={"ID":"c90f32b7-30fc-4e07-b5e9-d03a98a981d5","Type":"ContainerDied","Data":"d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269"} Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.954374 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9q2" event={"ID":"c90f32b7-30fc-4e07-b5e9-d03a98a981d5","Type":"ContainerDied","Data":"9a0c3c6495d5e5b8729b1a996a1a45c88a03521ae96cb0365bc09c7e087f356d"} Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.954392 4713 scope.go:117] "RemoveContainer" containerID="d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.954386 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7m9q2" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.982470 4713 scope.go:117] "RemoveContainer" containerID="514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579" Mar 14 07:04:38 crc kubenswrapper[4713]: I0314 07:04:38.994338 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7m9q2"] Mar 14 07:04:39 crc kubenswrapper[4713]: I0314 07:04:39.007761 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7m9q2"] Mar 14 07:04:39 crc kubenswrapper[4713]: I0314 07:04:39.525828 4713 scope.go:117] "RemoveContainer" containerID="fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de" Mar 14 07:04:39 crc kubenswrapper[4713]: I0314 07:04:39.583333 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" path="/var/lib/kubelet/pods/c90f32b7-30fc-4e07-b5e9-d03a98a981d5/volumes" Mar 14 07:04:39 crc kubenswrapper[4713]: I0314 07:04:39.626612 4713 scope.go:117] "RemoveContainer" containerID="d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269" Mar 14 07:04:39 crc kubenswrapper[4713]: E0314 07:04:39.627080 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269\": container with ID starting with d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269 not found: ID does not exist" containerID="d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269" Mar 14 07:04:39 crc kubenswrapper[4713]: I0314 07:04:39.627121 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269"} err="failed to get container status 
\"d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269\": rpc error: code = NotFound desc = could not find container \"d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269\": container with ID starting with d6ea1b4cdd7d0c6b550556223283da0d230b5df8b1a6e745393fec971a81d269 not found: ID does not exist" Mar 14 07:04:39 crc kubenswrapper[4713]: I0314 07:04:39.627149 4713 scope.go:117] "RemoveContainer" containerID="514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579" Mar 14 07:04:39 crc kubenswrapper[4713]: E0314 07:04:39.627458 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579\": container with ID starting with 514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579 not found: ID does not exist" containerID="514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579" Mar 14 07:04:39 crc kubenswrapper[4713]: I0314 07:04:39.627493 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579"} err="failed to get container status \"514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579\": rpc error: code = NotFound desc = could not find container \"514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579\": container with ID starting with 514553ddad4ad01bba9c53956c10e87814d659fd5c45ff9de242082c65808579 not found: ID does not exist" Mar 14 07:04:39 crc kubenswrapper[4713]: I0314 07:04:39.627522 4713 scope.go:117] "RemoveContainer" containerID="fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de" Mar 14 07:04:39 crc kubenswrapper[4713]: E0314 07:04:39.627935 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de\": container with ID starting with fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de not found: ID does not exist" containerID="fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de" Mar 14 07:04:39 crc kubenswrapper[4713]: I0314 07:04:39.627963 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de"} err="failed to get container status \"fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de\": rpc error: code = NotFound desc = could not find container \"fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de\": container with ID starting with fad154f169fa0e375c71b591025485bd8a6fec18027c21404760bece1f9c02de not found: ID does not exist" Mar 14 07:04:58 crc kubenswrapper[4713]: I0314 07:04:58.087820 4713 scope.go:117] "RemoveContainer" containerID="a3a22fa53360b13bc01788cdbf51588c87b40635ddc9aef83cd69a1f7a32f02a" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.171334 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557866-dmt55"] Mar 14 07:06:00 crc kubenswrapper[4713]: E0314 07:06:00.172430 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerName="extract-content" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.172444 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerName="extract-content" Mar 14 07:06:00 crc kubenswrapper[4713]: E0314 07:06:00.172463 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerName="extract-utilities" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.172470 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" 
containerName="extract-utilities" Mar 14 07:06:00 crc kubenswrapper[4713]: E0314 07:06:00.172488 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerName="registry-server" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.172513 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerName="registry-server" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.172805 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90f32b7-30fc-4e07-b5e9-d03a98a981d5" containerName="registry-server" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.174688 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-dmt55" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.177980 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.178678 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.181466 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.191888 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-dmt55"] Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.254356 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bqx\" (UniqueName: \"kubernetes.io/projected/50b57c93-ee45-43fa-b26d-5da0a6342431-kube-api-access-49bqx\") pod \"auto-csr-approver-29557866-dmt55\" (UID: \"50b57c93-ee45-43fa-b26d-5da0a6342431\") " pod="openshift-infra/auto-csr-approver-29557866-dmt55" Mar 14 07:06:00 crc 
kubenswrapper[4713]: I0314 07:06:00.592510 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49bqx\" (UniqueName: \"kubernetes.io/projected/50b57c93-ee45-43fa-b26d-5da0a6342431-kube-api-access-49bqx\") pod \"auto-csr-approver-29557866-dmt55\" (UID: \"50b57c93-ee45-43fa-b26d-5da0a6342431\") " pod="openshift-infra/auto-csr-approver-29557866-dmt55" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.623369 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49bqx\" (UniqueName: \"kubernetes.io/projected/50b57c93-ee45-43fa-b26d-5da0a6342431-kube-api-access-49bqx\") pod \"auto-csr-approver-29557866-dmt55\" (UID: \"50b57c93-ee45-43fa-b26d-5da0a6342431\") " pod="openshift-infra/auto-csr-approver-29557866-dmt55" Mar 14 07:06:00 crc kubenswrapper[4713]: I0314 07:06:00.799413 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-dmt55" Mar 14 07:06:01 crc kubenswrapper[4713]: I0314 07:06:01.796847 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-dmt55"] Mar 14 07:06:02 crc kubenswrapper[4713]: I0314 07:06:02.420854 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-dmt55" event={"ID":"50b57c93-ee45-43fa-b26d-5da0a6342431","Type":"ContainerStarted","Data":"764cee27a013a77e922edc99471ade3ccf73f7cbfde30f5cc66fb3678874c110"} Mar 14 07:06:03 crc kubenswrapper[4713]: I0314 07:06:03.434241 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-dmt55" event={"ID":"50b57c93-ee45-43fa-b26d-5da0a6342431","Type":"ContainerStarted","Data":"79be3d679d4460e53c6b426360b3562b94c3321d3ff25a0cf794bbbe316c218d"} Mar 14 07:06:03 crc kubenswrapper[4713]: I0314 07:06:03.452420 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29557866-dmt55" podStartSLOduration=2.411753353 podStartE2EDuration="3.452396306s" podCreationTimestamp="2026-03-14 07:06:00 +0000 UTC" firstStartedPulling="2026-03-14 07:06:01.801532969 +0000 UTC m=+5944.889442269" lastFinishedPulling="2026-03-14 07:06:02.842175922 +0000 UTC m=+5945.930085222" observedRunningTime="2026-03-14 07:06:03.452362875 +0000 UTC m=+5946.540272185" watchObservedRunningTime="2026-03-14 07:06:03.452396306 +0000 UTC m=+5946.540305616" Mar 14 07:06:04 crc kubenswrapper[4713]: I0314 07:06:04.446121 4713 generic.go:334] "Generic (PLEG): container finished" podID="50b57c93-ee45-43fa-b26d-5da0a6342431" containerID="79be3d679d4460e53c6b426360b3562b94c3321d3ff25a0cf794bbbe316c218d" exitCode=0 Mar 14 07:06:04 crc kubenswrapper[4713]: I0314 07:06:04.446166 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-dmt55" event={"ID":"50b57c93-ee45-43fa-b26d-5da0a6342431","Type":"ContainerDied","Data":"79be3d679d4460e53c6b426360b3562b94c3321d3ff25a0cf794bbbe316c218d"} Mar 14 07:06:05 crc kubenswrapper[4713]: I0314 07:06:05.913190 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-dmt55" Mar 14 07:06:05 crc kubenswrapper[4713]: I0314 07:06:05.935569 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49bqx\" (UniqueName: \"kubernetes.io/projected/50b57c93-ee45-43fa-b26d-5da0a6342431-kube-api-access-49bqx\") pod \"50b57c93-ee45-43fa-b26d-5da0a6342431\" (UID: \"50b57c93-ee45-43fa-b26d-5da0a6342431\") " Mar 14 07:06:05 crc kubenswrapper[4713]: I0314 07:06:05.993297 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b57c93-ee45-43fa-b26d-5da0a6342431-kube-api-access-49bqx" (OuterVolumeSpecName: "kube-api-access-49bqx") pod "50b57c93-ee45-43fa-b26d-5da0a6342431" (UID: "50b57c93-ee45-43fa-b26d-5da0a6342431"). InnerVolumeSpecName "kube-api-access-49bqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:06 crc kubenswrapper[4713]: I0314 07:06:06.039409 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49bqx\" (UniqueName: \"kubernetes.io/projected/50b57c93-ee45-43fa-b26d-5da0a6342431-kube-api-access-49bqx\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:06 crc kubenswrapper[4713]: I0314 07:06:06.476290 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-dmt55" event={"ID":"50b57c93-ee45-43fa-b26d-5da0a6342431","Type":"ContainerDied","Data":"764cee27a013a77e922edc99471ade3ccf73f7cbfde30f5cc66fb3678874c110"} Mar 14 07:06:06 crc kubenswrapper[4713]: I0314 07:06:06.476758 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="764cee27a013a77e922edc99471ade3ccf73f7cbfde30f5cc66fb3678874c110" Mar 14 07:06:06 crc kubenswrapper[4713]: I0314 07:06:06.476345 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-dmt55" Mar 14 07:06:06 crc kubenswrapper[4713]: I0314 07:06:06.544384 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557860-kg8kg"] Mar 14 07:06:06 crc kubenswrapper[4713]: I0314 07:06:06.555847 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557860-kg8kg"] Mar 14 07:06:07 crc kubenswrapper[4713]: I0314 07:06:07.579185 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90261356-badd-47e2-9eb6-b3c4d0a32919" path="/var/lib/kubelet/pods/90261356-badd-47e2-9eb6-b3c4d0a32919/volumes" Mar 14 07:06:38 crc kubenswrapper[4713]: I0314 07:06:38.848774 4713 generic.go:334] "Generic (PLEG): container finished" podID="0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" containerID="61b652ab071513c05275ebbf158c68291d61e3a0f6f2d0b3998f967f2f12ca4b" exitCode=1 Mar 14 07:06:38 crc kubenswrapper[4713]: I0314 07:06:38.848851 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1","Type":"ContainerDied","Data":"61b652ab071513c05275ebbf158c68291d61e3a0f6f2d0b3998f967f2f12ca4b"} Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.445749 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.524356 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-workdir\") pod \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.524938 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mndjs\" (UniqueName: \"kubernetes.io/projected/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-kube-api-access-mndjs\") pod \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.524979 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config\") pod \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.525006 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-config-data\") pod \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.525041 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.525171 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ssh-key\") pod \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.525436 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config-secret\") pod \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.525487 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-temporary\") pod \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.525629 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ca-certs\") pod \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\" (UID: \"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1\") " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.526122 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-config-data" (OuterVolumeSpecName: "config-data") pod "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" (UID: "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.526612 4713 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.527667 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" (UID: "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.540791 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-kube-api-access-mndjs" (OuterVolumeSpecName: "kube-api-access-mndjs") pod "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" (UID: "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1"). InnerVolumeSpecName "kube-api-access-mndjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.541510 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" (UID: "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.549708 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" (UID: "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.574385 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" (UID: "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.584624 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" (UID: "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.588093 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" (UID: "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.613826 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" (UID: "0236ca7c-fd1b-42f0-805c-8d53e34a3cc1"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.629923 4713 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.630275 4713 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.630389 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mndjs\" (UniqueName: \"kubernetes.io/projected/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-kube-api-access-mndjs\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.630467 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.631058 4713 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.631155 4713 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.631349 4713 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.631420 4713 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0236ca7c-fd1b-42f0-805c-8d53e34a3cc1-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.671362 4713 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.731876 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.731950 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.733577 4713 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.880876 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0236ca7c-fd1b-42f0-805c-8d53e34a3cc1","Type":"ContainerDied","Data":"7d5dc183b321eec63770f1de1041061158c3593d87bb3dca8b2b017e0bdd6342"} Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.881147 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d5dc183b321eec63770f1de1041061158c3593d87bb3dca8b2b017e0bdd6342" Mar 14 07:06:40 crc kubenswrapper[4713]: I0314 07:06:40.881038 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 07:06:51 crc kubenswrapper[4713]: I0314 07:06:51.931848 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 14 07:06:51 crc kubenswrapper[4713]: E0314 07:06:51.933946 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" containerName="tempest-tests-tempest-tests-runner" Mar 14 07:06:51 crc kubenswrapper[4713]: I0314 07:06:51.933978 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" containerName="tempest-tests-tempest-tests-runner" Mar 14 07:06:51 crc kubenswrapper[4713]: E0314 07:06:51.934051 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b57c93-ee45-43fa-b26d-5da0a6342431" containerName="oc" Mar 14 07:06:51 crc kubenswrapper[4713]: I0314 07:06:51.934063 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b57c93-ee45-43fa-b26d-5da0a6342431" containerName="oc" Mar 14 07:06:51 crc kubenswrapper[4713]: I0314 07:06:51.934471 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b57c93-ee45-43fa-b26d-5da0a6342431" containerName="oc" Mar 14 07:06:51 crc kubenswrapper[4713]: I0314 07:06:51.934495 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0236ca7c-fd1b-42f0-805c-8d53e34a3cc1" 
containerName="tempest-tests-tempest-tests-runner" Mar 14 07:06:51 crc kubenswrapper[4713]: I0314 07:06:51.936080 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 07:06:51 crc kubenswrapper[4713]: I0314 07:06:51.938944 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qcv76" Mar 14 07:06:51 crc kubenswrapper[4713]: I0314 07:06:51.976603 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 14 07:06:52 crc kubenswrapper[4713]: I0314 07:06:52.051073 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7zr8\" (UniqueName: \"kubernetes.io/projected/1a497ede-a36f-4e68-a3d0-9998e7c4851b-kube-api-access-j7zr8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a497ede-a36f-4e68-a3d0-9998e7c4851b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 07:06:52 crc kubenswrapper[4713]: I0314 07:06:52.051133 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a497ede-a36f-4e68-a3d0-9998e7c4851b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 07:06:52 crc kubenswrapper[4713]: I0314 07:06:52.156740 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7zr8\" (UniqueName: \"kubernetes.io/projected/1a497ede-a36f-4e68-a3d0-9998e7c4851b-kube-api-access-j7zr8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a497ede-a36f-4e68-a3d0-9998e7c4851b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 07:06:52 crc 
kubenswrapper[4713]: I0314 07:06:52.156859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a497ede-a36f-4e68-a3d0-9998e7c4851b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 07:06:52 crc kubenswrapper[4713]: I0314 07:06:52.160660 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a497ede-a36f-4e68-a3d0-9998e7c4851b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 07:06:52 crc kubenswrapper[4713]: I0314 07:06:52.224344 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7zr8\" (UniqueName: \"kubernetes.io/projected/1a497ede-a36f-4e68-a3d0-9998e7c4851b-kube-api-access-j7zr8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a497ede-a36f-4e68-a3d0-9998e7c4851b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 07:06:52 crc kubenswrapper[4713]: I0314 07:06:52.265583 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1a497ede-a36f-4e68-a3d0-9998e7c4851b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 07:06:52 crc kubenswrapper[4713]: I0314 07:06:52.563043 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 07:06:53 crc kubenswrapper[4713]: I0314 07:06:53.049544 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 14 07:06:54 crc kubenswrapper[4713]: I0314 07:06:54.035156 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1a497ede-a36f-4e68-a3d0-9998e7c4851b","Type":"ContainerStarted","Data":"229830d76eb19cd590c5e073acb933534dcad143c5b66c8b89a65379d3e96dbb"} Mar 14 07:06:56 crc kubenswrapper[4713]: I0314 07:06:56.060287 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1a497ede-a36f-4e68-a3d0-9998e7c4851b","Type":"ContainerStarted","Data":"72c287f85b2924c25b33011bdf1584befd00e24f4b5e8b729cb0e07642b10a99"} Mar 14 07:06:56 crc kubenswrapper[4713]: I0314 07:06:56.077114 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.52767405 podStartE2EDuration="5.077088717s" podCreationTimestamp="2026-03-14 07:06:51 +0000 UTC" firstStartedPulling="2026-03-14 07:06:53.051847503 +0000 UTC m=+5996.139756803" lastFinishedPulling="2026-03-14 07:06:55.60126217 +0000 UTC m=+5998.689171470" observedRunningTime="2026-03-14 07:06:56.072475242 +0000 UTC m=+5999.160384542" watchObservedRunningTime="2026-03-14 07:06:56.077088717 +0000 UTC m=+5999.164998027" Mar 14 07:06:58 crc kubenswrapper[4713]: I0314 07:06:58.229134 4713 scope.go:117] "RemoveContainer" containerID="1ef12ee5236056489832626feccf550f542b7d7240818a29626417a2549550fa" Mar 14 07:07:10 crc kubenswrapper[4713]: I0314 07:07:10.731723 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:07:10 crc kubenswrapper[4713]: I0314 07:07:10.733832 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.294037 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8mfn2/must-gather-shpdg"] Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.301351 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.304496 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8mfn2"/"default-dockercfg-x6chn" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.308732 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8mfn2"/"openshift-service-ca.crt" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.315555 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8mfn2"/"kube-root-ca.crt" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.436021 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjpgm\" (UniqueName: \"kubernetes.io/projected/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-kube-api-access-xjpgm\") pod \"must-gather-shpdg\" (UID: \"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\") " pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.443357 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-must-gather-output\") pod \"must-gather-shpdg\" (UID: \"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\") " pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.483505 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfn2/must-gather-shpdg"] Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.556984 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjpgm\" (UniqueName: \"kubernetes.io/projected/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-kube-api-access-xjpgm\") pod \"must-gather-shpdg\" (UID: \"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\") " pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.557158 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-must-gather-output\") pod \"must-gather-shpdg\" (UID: \"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\") " pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.557699 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-must-gather-output\") pod \"must-gather-shpdg\" (UID: \"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\") " pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.598110 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjpgm\" (UniqueName: \"kubernetes.io/projected/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-kube-api-access-xjpgm\") pod \"must-gather-shpdg\" (UID: 
\"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\") " pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.622885 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.676919 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9q79z"] Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.682300 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.767417 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9q79z"] Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.783232 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sgbt\" (UniqueName: \"kubernetes.io/projected/762bc0d1-afca-4c67-9336-2855e56cb9e4-kube-api-access-6sgbt\") pod \"redhat-operators-9q79z\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.783531 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-utilities\") pod \"redhat-operators-9q79z\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.783887 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-catalog-content\") pod \"redhat-operators-9q79z\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " 
pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.887789 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sgbt\" (UniqueName: \"kubernetes.io/projected/762bc0d1-afca-4c67-9336-2855e56cb9e4-kube-api-access-6sgbt\") pod \"redhat-operators-9q79z\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.887961 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-utilities\") pod \"redhat-operators-9q79z\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.888152 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-catalog-content\") pod \"redhat-operators-9q79z\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.889039 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-catalog-content\") pod \"redhat-operators-9q79z\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:34 crc kubenswrapper[4713]: I0314 07:07:34.889089 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-utilities\") pod \"redhat-operators-9q79z\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:34 crc 
kubenswrapper[4713]: I0314 07:07:34.924643 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sgbt\" (UniqueName: \"kubernetes.io/projected/762bc0d1-afca-4c67-9336-2855e56cb9e4-kube-api-access-6sgbt\") pod \"redhat-operators-9q79z\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:35 crc kubenswrapper[4713]: I0314 07:07:35.217957 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:35 crc kubenswrapper[4713]: I0314 07:07:35.399174 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfn2/must-gather-shpdg"] Mar 14 07:07:35 crc kubenswrapper[4713]: I0314 07:07:35.611630 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/must-gather-shpdg" event={"ID":"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b","Type":"ContainerStarted","Data":"20efb1b17979e0b76e5477b01a7a0590222b9b6f256330285679b56ed1f61ae9"} Mar 14 07:07:35 crc kubenswrapper[4713]: W0314 07:07:35.742888 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod762bc0d1_afca_4c67_9336_2855e56cb9e4.slice/crio-56a0bb70535e4b67fe05a12123b9b9d441d527c19a530a7f0def4fa4dadf5a76 WatchSource:0}: Error finding container 56a0bb70535e4b67fe05a12123b9b9d441d527c19a530a7f0def4fa4dadf5a76: Status 404 returned error can't find the container with id 56a0bb70535e4b67fe05a12123b9b9d441d527c19a530a7f0def4fa4dadf5a76 Mar 14 07:07:35 crc kubenswrapper[4713]: I0314 07:07:35.743746 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9q79z"] Mar 14 07:07:36 crc kubenswrapper[4713]: I0314 07:07:36.629363 4713 generic.go:334] "Generic (PLEG): container finished" podID="762bc0d1-afca-4c67-9336-2855e56cb9e4" 
containerID="de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0" exitCode=0 Mar 14 07:07:36 crc kubenswrapper[4713]: I0314 07:07:36.629475 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9q79z" event={"ID":"762bc0d1-afca-4c67-9336-2855e56cb9e4","Type":"ContainerDied","Data":"de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0"} Mar 14 07:07:36 crc kubenswrapper[4713]: I0314 07:07:36.629665 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9q79z" event={"ID":"762bc0d1-afca-4c67-9336-2855e56cb9e4","Type":"ContainerStarted","Data":"56a0bb70535e4b67fe05a12123b9b9d441d527c19a530a7f0def4fa4dadf5a76"} Mar 14 07:07:38 crc kubenswrapper[4713]: I0314 07:07:38.679859 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9q79z" event={"ID":"762bc0d1-afca-4c67-9336-2855e56cb9e4","Type":"ContainerStarted","Data":"1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0"} Mar 14 07:07:40 crc kubenswrapper[4713]: I0314 07:07:40.731353 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:07:40 crc kubenswrapper[4713]: I0314 07:07:40.731994 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:07:40 crc kubenswrapper[4713]: I0314 07:07:40.732056 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 07:07:40 crc kubenswrapper[4713]: I0314 07:07:40.734480 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:07:40 crc kubenswrapper[4713]: I0314 07:07:40.734551 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" gracePeriod=600 Mar 14 07:07:41 crc kubenswrapper[4713]: I0314 07:07:41.717563 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" exitCode=0 Mar 14 07:07:41 crc kubenswrapper[4713]: I0314 07:07:41.717620 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"} Mar 14 07:07:41 crc kubenswrapper[4713]: I0314 07:07:41.717909 4713 scope.go:117] "RemoveContainer" containerID="79c0fd346a704a8b6683d885db93825cd1c533c532906608113e5943d1d5133e" Mar 14 07:07:43 crc kubenswrapper[4713]: E0314 07:07:43.948656 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:07:44 crc kubenswrapper[4713]: I0314 07:07:44.768410 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:07:44 crc kubenswrapper[4713]: E0314 07:07:44.769392 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:07:48 crc kubenswrapper[4713]: I0314 07:07:48.843370 4713 generic.go:334] "Generic (PLEG): container finished" podID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerID="1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0" exitCode=0 Mar 14 07:07:48 crc kubenswrapper[4713]: I0314 07:07:48.843449 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9q79z" event={"ID":"762bc0d1-afca-4c67-9336-2855e56cb9e4","Type":"ContainerDied","Data":"1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0"} Mar 14 07:07:49 crc kubenswrapper[4713]: I0314 07:07:49.871573 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/must-gather-shpdg" event={"ID":"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b","Type":"ContainerStarted","Data":"10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335"} Mar 14 07:07:49 crc kubenswrapper[4713]: I0314 07:07:49.872037 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/must-gather-shpdg" 
event={"ID":"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b","Type":"ContainerStarted","Data":"2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3"} Mar 14 07:07:49 crc kubenswrapper[4713]: I0314 07:07:49.874581 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9q79z" event={"ID":"762bc0d1-afca-4c67-9336-2855e56cb9e4","Type":"ContainerStarted","Data":"759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3"} Mar 14 07:07:49 crc kubenswrapper[4713]: I0314 07:07:49.899247 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8mfn2/must-gather-shpdg" podStartSLOduration=2.5565171209999997 podStartE2EDuration="15.899197025s" podCreationTimestamp="2026-03-14 07:07:34 +0000 UTC" firstStartedPulling="2026-03-14 07:07:35.455455983 +0000 UTC m=+6038.543365273" lastFinishedPulling="2026-03-14 07:07:48.798135877 +0000 UTC m=+6051.886045177" observedRunningTime="2026-03-14 07:07:49.89068764 +0000 UTC m=+6052.978596950" watchObservedRunningTime="2026-03-14 07:07:49.899197025 +0000 UTC m=+6052.987106325" Mar 14 07:07:49 crc kubenswrapper[4713]: I0314 07:07:49.920828 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9q79z" podStartSLOduration=3.28402499 podStartE2EDuration="15.920787871s" podCreationTimestamp="2026-03-14 07:07:34 +0000 UTC" firstStartedPulling="2026-03-14 07:07:36.631990269 +0000 UTC m=+6039.719899569" lastFinishedPulling="2026-03-14 07:07:49.26875315 +0000 UTC m=+6052.356662450" observedRunningTime="2026-03-14 07:07:49.913228324 +0000 UTC m=+6053.001137624" watchObservedRunningTime="2026-03-14 07:07:49.920787871 +0000 UTC m=+6053.008697161" Mar 14 07:07:55 crc kubenswrapper[4713]: I0314 07:07:55.218275 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:55 crc kubenswrapper[4713]: I0314 07:07:55.219013 4713 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:07:56 crc kubenswrapper[4713]: I0314 07:07:56.273628 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9q79z" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" probeResult="failure" output=< Mar 14 07:07:56 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:07:56 crc kubenswrapper[4713]: > Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.038114 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8mfn2/crc-debug-stc2l"] Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.040440 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.190062 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-host\") pod \"crc-debug-stc2l\" (UID: \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\") " pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.190142 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcrh5\" (UniqueName: \"kubernetes.io/projected/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-kube-api-access-kcrh5\") pod \"crc-debug-stc2l\" (UID: \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\") " pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.293417 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-host\") pod \"crc-debug-stc2l\" (UID: \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\") " 
pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.293500 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcrh5\" (UniqueName: \"kubernetes.io/projected/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-kube-api-access-kcrh5\") pod \"crc-debug-stc2l\" (UID: \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\") " pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.293640 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-host\") pod \"crc-debug-stc2l\" (UID: \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\") " pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.312650 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcrh5\" (UniqueName: \"kubernetes.io/projected/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-kube-api-access-kcrh5\") pod \"crc-debug-stc2l\" (UID: \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\") " pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.361759 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:07:57 crc kubenswrapper[4713]: W0314 07:07:57.426122 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26621cc1_fd4e_463b_aea6_65b2d9c4c85d.slice/crio-c2823f817535ad24b511b6ab0c1141da8334013c1faa9520f7f2885b6f80a0c8 WatchSource:0}: Error finding container c2823f817535ad24b511b6ab0c1141da8334013c1faa9520f7f2885b6f80a0c8: Status 404 returned error can't find the container with id c2823f817535ad24b511b6ab0c1141da8334013c1faa9520f7f2885b6f80a0c8 Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.572745 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:07:57 crc kubenswrapper[4713]: E0314 07:07:57.573307 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:07:57 crc kubenswrapper[4713]: I0314 07:07:57.960772 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" event={"ID":"26621cc1-fd4e-463b-aea6-65b2d9c4c85d","Type":"ContainerStarted","Data":"c2823f817535ad24b511b6ab0c1141da8334013c1faa9520f7f2885b6f80a0c8"} Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.165228 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557868-g2hff"] Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.169050 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-g2hff" Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.171160 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.171382 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.181627 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.183403 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-g2hff"] Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.284164 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccss7\" (UniqueName: \"kubernetes.io/projected/c3d08e70-3946-4a9a-9c14-1078698c2383-kube-api-access-ccss7\") pod \"auto-csr-approver-29557868-g2hff\" (UID: \"c3d08e70-3946-4a9a-9c14-1078698c2383\") " pod="openshift-infra/auto-csr-approver-29557868-g2hff" Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.386997 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccss7\" (UniqueName: \"kubernetes.io/projected/c3d08e70-3946-4a9a-9c14-1078698c2383-kube-api-access-ccss7\") pod \"auto-csr-approver-29557868-g2hff\" (UID: \"c3d08e70-3946-4a9a-9c14-1078698c2383\") " pod="openshift-infra/auto-csr-approver-29557868-g2hff" Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.410157 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccss7\" (UniqueName: \"kubernetes.io/projected/c3d08e70-3946-4a9a-9c14-1078698c2383-kube-api-access-ccss7\") pod \"auto-csr-approver-29557868-g2hff\" (UID: \"c3d08e70-3946-4a9a-9c14-1078698c2383\") " 
pod="openshift-infra/auto-csr-approver-29557868-g2hff" Mar 14 07:08:00 crc kubenswrapper[4713]: I0314 07:08:00.496138 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-g2hff" Mar 14 07:08:01 crc kubenswrapper[4713]: I0314 07:08:01.031687 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-g2hff"] Mar 14 07:08:01 crc kubenswrapper[4713]: W0314 07:08:01.033469 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3d08e70_3946_4a9a_9c14_1078698c2383.slice/crio-f089a6406a55a931ae7124865b798c8241cf681e501b9cb2d3d30868f9a99361 WatchSource:0}: Error finding container f089a6406a55a931ae7124865b798c8241cf681e501b9cb2d3d30868f9a99361: Status 404 returned error can't find the container with id f089a6406a55a931ae7124865b798c8241cf681e501b9cb2d3d30868f9a99361 Mar 14 07:08:02 crc kubenswrapper[4713]: I0314 07:08:02.006172 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-g2hff" event={"ID":"c3d08e70-3946-4a9a-9c14-1078698c2383","Type":"ContainerStarted","Data":"f089a6406a55a931ae7124865b798c8241cf681e501b9cb2d3d30868f9a99361"} Mar 14 07:08:04 crc kubenswrapper[4713]: I0314 07:08:04.076180 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-g2hff" event={"ID":"c3d08e70-3946-4a9a-9c14-1078698c2383","Type":"ContainerStarted","Data":"fee625f7ef359ca7f1c872067695ff965ad4ab9c615e972fc59d390d642d1d5c"} Mar 14 07:08:04 crc kubenswrapper[4713]: I0314 07:08:04.094302 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557868-g2hff" podStartSLOduration=2.236176871 podStartE2EDuration="4.09428416s" podCreationTimestamp="2026-03-14 07:08:00 +0000 UTC" firstStartedPulling="2026-03-14 07:08:01.036661256 +0000 UTC 
m=+6064.124570556" lastFinishedPulling="2026-03-14 07:08:02.894768545 +0000 UTC m=+6065.982677845" observedRunningTime="2026-03-14 07:08:04.093484235 +0000 UTC m=+6067.181393555" watchObservedRunningTime="2026-03-14 07:08:04.09428416 +0000 UTC m=+6067.182193460" Mar 14 07:08:06 crc kubenswrapper[4713]: I0314 07:08:06.105172 4713 generic.go:334] "Generic (PLEG): container finished" podID="c3d08e70-3946-4a9a-9c14-1078698c2383" containerID="fee625f7ef359ca7f1c872067695ff965ad4ab9c615e972fc59d390d642d1d5c" exitCode=0 Mar 14 07:08:06 crc kubenswrapper[4713]: I0314 07:08:06.105268 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-g2hff" event={"ID":"c3d08e70-3946-4a9a-9c14-1078698c2383","Type":"ContainerDied","Data":"fee625f7ef359ca7f1c872067695ff965ad4ab9c615e972fc59d390d642d1d5c"} Mar 14 07:08:06 crc kubenswrapper[4713]: I0314 07:08:06.276492 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9q79z" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" probeResult="failure" output=< Mar 14 07:08:06 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:08:06 crc kubenswrapper[4713]: > Mar 14 07:08:12 crc kubenswrapper[4713]: I0314 07:08:12.565173 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:08:12 crc kubenswrapper[4713]: E0314 07:08:12.566143 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:08:16 crc kubenswrapper[4713]: I0314 07:08:16.287190 4713 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9q79z" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" probeResult="failure" output=< Mar 14 07:08:16 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:08:16 crc kubenswrapper[4713]: > Mar 14 07:08:16 crc kubenswrapper[4713]: I0314 07:08:16.355350 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-g2hff" Mar 14 07:08:16 crc kubenswrapper[4713]: E0314 07:08:16.362763 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Mar 14 07:08:16 crc kubenswrapper[4713]: E0314 07:08:16.364326 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcrh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-stc2l_openshift-must-gather-8mfn2(26621cc1-fd4e-463b-aea6-65b2d9c4c85d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:08:16 crc kubenswrapper[4713]: E0314 07:08:16.365689 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" podUID="26621cc1-fd4e-463b-aea6-65b2d9c4c85d" Mar 14 07:08:16 crc kubenswrapper[4713]: I0314 07:08:16.467711 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccss7\" (UniqueName: 
\"kubernetes.io/projected/c3d08e70-3946-4a9a-9c14-1078698c2383-kube-api-access-ccss7\") pod \"c3d08e70-3946-4a9a-9c14-1078698c2383\" (UID: \"c3d08e70-3946-4a9a-9c14-1078698c2383\") " Mar 14 07:08:16 crc kubenswrapper[4713]: I0314 07:08:16.480875 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d08e70-3946-4a9a-9c14-1078698c2383-kube-api-access-ccss7" (OuterVolumeSpecName: "kube-api-access-ccss7") pod "c3d08e70-3946-4a9a-9c14-1078698c2383" (UID: "c3d08e70-3946-4a9a-9c14-1078698c2383"). InnerVolumeSpecName "kube-api-access-ccss7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:08:16 crc kubenswrapper[4713]: I0314 07:08:16.572178 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccss7\" (UniqueName: \"kubernetes.io/projected/c3d08e70-3946-4a9a-9c14-1078698c2383-kube-api-access-ccss7\") on node \"crc\" DevicePath \"\"" Mar 14 07:08:17 crc kubenswrapper[4713]: I0314 07:08:17.236064 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-g2hff" event={"ID":"c3d08e70-3946-4a9a-9c14-1078698c2383","Type":"ContainerDied","Data":"f089a6406a55a931ae7124865b798c8241cf681e501b9cb2d3d30868f9a99361"} Mar 14 07:08:17 crc kubenswrapper[4713]: I0314 07:08:17.236105 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-g2hff" Mar 14 07:08:17 crc kubenswrapper[4713]: I0314 07:08:17.236129 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f089a6406a55a931ae7124865b798c8241cf681e501b9cb2d3d30868f9a99361" Mar 14 07:08:17 crc kubenswrapper[4713]: E0314 07:08:17.239578 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" podUID="26621cc1-fd4e-463b-aea6-65b2d9c4c85d" Mar 14 07:08:17 crc kubenswrapper[4713]: I0314 07:08:17.502307 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-7mnk4"] Mar 14 07:08:17 crc kubenswrapper[4713]: I0314 07:08:17.535618 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-7mnk4"] Mar 14 07:08:17 crc kubenswrapper[4713]: I0314 07:08:17.591092 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974ecbeb-4410-4c5a-aa81-ea8f55933004" path="/var/lib/kubelet/pods/974ecbeb-4410-4c5a-aa81-ea8f55933004/volumes" Mar 14 07:08:26 crc kubenswrapper[4713]: I0314 07:08:26.279314 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9q79z" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" probeResult="failure" output=< Mar 14 07:08:26 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:08:26 crc kubenswrapper[4713]: > Mar 14 07:08:27 crc kubenswrapper[4713]: I0314 07:08:27.573558 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:08:27 crc kubenswrapper[4713]: E0314 
07:08:27.574416 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:08:29 crc kubenswrapper[4713]: I0314 07:08:29.567465 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:08:30 crc kubenswrapper[4713]: I0314 07:08:30.420868 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" event={"ID":"26621cc1-fd4e-463b-aea6-65b2d9c4c85d","Type":"ContainerStarted","Data":"defd450a42c66f6869b13ecc59759cd924433325ebb78b7a4edc28f026a44ce9"} Mar 14 07:08:30 crc kubenswrapper[4713]: I0314 07:08:30.449404 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" podStartSLOduration=0.886267954 podStartE2EDuration="33.449373561s" podCreationTimestamp="2026-03-14 07:07:57 +0000 UTC" firstStartedPulling="2026-03-14 07:07:57.428853593 +0000 UTC m=+6060.516762893" lastFinishedPulling="2026-03-14 07:08:29.99195919 +0000 UTC m=+6093.079868500" observedRunningTime="2026-03-14 07:08:30.43363398 +0000 UTC m=+6093.521543280" watchObservedRunningTime="2026-03-14 07:08:30.449373561 +0000 UTC m=+6093.537282881" Mar 14 07:08:36 crc kubenswrapper[4713]: I0314 07:08:36.300063 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9q79z" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" probeResult="failure" output=< Mar 14 07:08:36 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:08:36 crc kubenswrapper[4713]: > Mar 14 07:08:42 crc 
kubenswrapper[4713]: I0314 07:08:42.563857 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:08:42 crc kubenswrapper[4713]: E0314 07:08:42.564876 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:08:46 crc kubenswrapper[4713]: I0314 07:08:46.281979 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9q79z" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" probeResult="failure" output=< Mar 14 07:08:46 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:08:46 crc kubenswrapper[4713]: > Mar 14 07:08:55 crc kubenswrapper[4713]: I0314 07:08:55.564742 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:08:55 crc kubenswrapper[4713]: E0314 07:08:55.565530 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:08:56 crc kubenswrapper[4713]: I0314 07:08:56.272974 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9q79z" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" 
probeResult="failure" output=< Mar 14 07:08:56 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:08:56 crc kubenswrapper[4713]: > Mar 14 07:08:58 crc kubenswrapper[4713]: I0314 07:08:58.422866 4713 scope.go:117] "RemoveContainer" containerID="a6619d0a287160df5d4d84bedec684f2a9348042bc0ac656f704614b19ef8f06" Mar 14 07:09:05 crc kubenswrapper[4713]: I0314 07:09:05.286381 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:09:05 crc kubenswrapper[4713]: I0314 07:09:05.348079 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:09:05 crc kubenswrapper[4713]: I0314 07:09:05.902668 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9q79z"] Mar 14 07:09:06 crc kubenswrapper[4713]: I0314 07:09:06.564073 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:09:06 crc kubenswrapper[4713]: E0314 07:09:06.564466 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:09:06 crc kubenswrapper[4713]: I0314 07:09:06.925807 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9q79z" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" containerID="cri-o://759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3" gracePeriod=2 Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.543266 
4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.713055 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-utilities\") pod \"762bc0d1-afca-4c67-9336-2855e56cb9e4\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.713273 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sgbt\" (UniqueName: \"kubernetes.io/projected/762bc0d1-afca-4c67-9336-2855e56cb9e4-kube-api-access-6sgbt\") pod \"762bc0d1-afca-4c67-9336-2855e56cb9e4\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.713466 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-catalog-content\") pod \"762bc0d1-afca-4c67-9336-2855e56cb9e4\" (UID: \"762bc0d1-afca-4c67-9336-2855e56cb9e4\") " Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.715002 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-utilities" (OuterVolumeSpecName: "utilities") pod "762bc0d1-afca-4c67-9336-2855e56cb9e4" (UID: "762bc0d1-afca-4c67-9336-2855e56cb9e4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.716132 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.725113 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762bc0d1-afca-4c67-9336-2855e56cb9e4-kube-api-access-6sgbt" (OuterVolumeSpecName: "kube-api-access-6sgbt") pod "762bc0d1-afca-4c67-9336-2855e56cb9e4" (UID: "762bc0d1-afca-4c67-9336-2855e56cb9e4"). InnerVolumeSpecName "kube-api-access-6sgbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.819121 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sgbt\" (UniqueName: \"kubernetes.io/projected/762bc0d1-afca-4c67-9336-2855e56cb9e4-kube-api-access-6sgbt\") on node \"crc\" DevicePath \"\"" Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.879534 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "762bc0d1-afca-4c67-9336-2855e56cb9e4" (UID: "762bc0d1-afca-4c67-9336-2855e56cb9e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.921561 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762bc0d1-afca-4c67-9336-2855e56cb9e4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.952810 4713 generic.go:334] "Generic (PLEG): container finished" podID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerID="759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3" exitCode=0 Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.952858 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9q79z" Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.952852 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9q79z" event={"ID":"762bc0d1-afca-4c67-9336-2855e56cb9e4","Type":"ContainerDied","Data":"759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3"} Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.952973 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9q79z" event={"ID":"762bc0d1-afca-4c67-9336-2855e56cb9e4","Type":"ContainerDied","Data":"56a0bb70535e4b67fe05a12123b9b9d441d527c19a530a7f0def4fa4dadf5a76"} Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.952990 4713 scope.go:117] "RemoveContainer" containerID="759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3" Mar 14 07:09:07 crc kubenswrapper[4713]: I0314 07:09:07.974227 4713 scope.go:117] "RemoveContainer" containerID="1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0" Mar 14 07:09:08 crc kubenswrapper[4713]: I0314 07:09:08.004272 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9q79z"] Mar 14 07:09:08 crc kubenswrapper[4713]: I0314 
07:09:08.023092 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9q79z"] Mar 14 07:09:08 crc kubenswrapper[4713]: I0314 07:09:08.023673 4713 scope.go:117] "RemoveContainer" containerID="de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0" Mar 14 07:09:08 crc kubenswrapper[4713]: I0314 07:09:08.062030 4713 scope.go:117] "RemoveContainer" containerID="759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3" Mar 14 07:09:08 crc kubenswrapper[4713]: E0314 07:09:08.063115 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3\": container with ID starting with 759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3 not found: ID does not exist" containerID="759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3" Mar 14 07:09:08 crc kubenswrapper[4713]: I0314 07:09:08.063158 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3"} err="failed to get container status \"759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3\": rpc error: code = NotFound desc = could not find container \"759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3\": container with ID starting with 759953ed6ff032bd037617d019163fe501e650a05319b1dd0002f0feceb101e3 not found: ID does not exist" Mar 14 07:09:08 crc kubenswrapper[4713]: I0314 07:09:08.063186 4713 scope.go:117] "RemoveContainer" containerID="1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0" Mar 14 07:09:08 crc kubenswrapper[4713]: E0314 07:09:08.063446 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0\": container with ID 
starting with 1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0 not found: ID does not exist" containerID="1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0" Mar 14 07:09:08 crc kubenswrapper[4713]: I0314 07:09:08.063469 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0"} err="failed to get container status \"1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0\": rpc error: code = NotFound desc = could not find container \"1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0\": container with ID starting with 1df7a7655927e57c1c618734b4ee83b4962cb32ea4bfd365e876fc73ed9b6da0 not found: ID does not exist" Mar 14 07:09:08 crc kubenswrapper[4713]: I0314 07:09:08.063482 4713 scope.go:117] "RemoveContainer" containerID="de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0" Mar 14 07:09:08 crc kubenswrapper[4713]: E0314 07:09:08.063699 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0\": container with ID starting with de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0 not found: ID does not exist" containerID="de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0" Mar 14 07:09:08 crc kubenswrapper[4713]: I0314 07:09:08.063720 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0"} err="failed to get container status \"de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0\": rpc error: code = NotFound desc = could not find container \"de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0\": container with ID starting with de4a2fdebefbc796182916fd1cbab08b820ab10dab4533b2faa9927aeac2d0b0 not found: 
ID does not exist" Mar 14 07:09:09 crc kubenswrapper[4713]: I0314 07:09:09.580264 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" path="/var/lib/kubelet/pods/762bc0d1-afca-4c67-9336-2855e56cb9e4/volumes" Mar 14 07:09:21 crc kubenswrapper[4713]: I0314 07:09:21.565059 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:09:21 crc kubenswrapper[4713]: E0314 07:09:21.565966 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:09:23 crc kubenswrapper[4713]: I0314 07:09:23.122930 4713 generic.go:334] "Generic (PLEG): container finished" podID="26621cc1-fd4e-463b-aea6-65b2d9c4c85d" containerID="defd450a42c66f6869b13ecc59759cd924433325ebb78b7a4edc28f026a44ce9" exitCode=0 Mar 14 07:09:23 crc kubenswrapper[4713]: I0314 07:09:23.123021 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" event={"ID":"26621cc1-fd4e-463b-aea6-65b2d9c4c85d","Type":"ContainerDied","Data":"defd450a42c66f6869b13ecc59759cd924433325ebb78b7a4edc28f026a44ce9"} Mar 14 07:09:24 crc kubenswrapper[4713]: I0314 07:09:24.277627 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:09:24 crc kubenswrapper[4713]: I0314 07:09:24.325992 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8mfn2/crc-debug-stc2l"] Mar 14 07:09:24 crc kubenswrapper[4713]: I0314 07:09:24.340272 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8mfn2/crc-debug-stc2l"] Mar 14 07:09:24 crc kubenswrapper[4713]: I0314 07:09:24.363662 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcrh5\" (UniqueName: \"kubernetes.io/projected/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-kube-api-access-kcrh5\") pod \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\" (UID: \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\") " Mar 14 07:09:24 crc kubenswrapper[4713]: I0314 07:09:24.363962 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-host\") pod \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\" (UID: \"26621cc1-fd4e-463b-aea6-65b2d9c4c85d\") " Mar 14 07:09:24 crc kubenswrapper[4713]: I0314 07:09:24.364021 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-host" (OuterVolumeSpecName: "host") pod "26621cc1-fd4e-463b-aea6-65b2d9c4c85d" (UID: "26621cc1-fd4e-463b-aea6-65b2d9c4c85d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:09:24 crc kubenswrapper[4713]: I0314 07:09:24.364775 4713 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-host\") on node \"crc\" DevicePath \"\"" Mar 14 07:09:24 crc kubenswrapper[4713]: I0314 07:09:24.370746 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-kube-api-access-kcrh5" (OuterVolumeSpecName: "kube-api-access-kcrh5") pod "26621cc1-fd4e-463b-aea6-65b2d9c4c85d" (UID: "26621cc1-fd4e-463b-aea6-65b2d9c4c85d"). InnerVolumeSpecName "kube-api-access-kcrh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:09:24 crc kubenswrapper[4713]: I0314 07:09:24.467494 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcrh5\" (UniqueName: \"kubernetes.io/projected/26621cc1-fd4e-463b-aea6-65b2d9c4c85d-kube-api-access-kcrh5\") on node \"crc\" DevicePath \"\"" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.155893 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2823f817535ad24b511b6ab0c1141da8334013c1faa9520f7f2885b6f80a0c8" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.156017 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-stc2l" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.580168 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26621cc1-fd4e-463b-aea6-65b2d9c4c85d" path="/var/lib/kubelet/pods/26621cc1-fd4e-463b-aea6-65b2d9c4c85d/volumes" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.580897 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8mfn2/crc-debug-hz28j"] Mar 14 07:09:25 crc kubenswrapper[4713]: E0314 07:09:25.581415 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26621cc1-fd4e-463b-aea6-65b2d9c4c85d" containerName="container-00" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.581437 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="26621cc1-fd4e-463b-aea6-65b2d9c4c85d" containerName="container-00" Mar 14 07:09:25 crc kubenswrapper[4713]: E0314 07:09:25.581457 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="extract-utilities" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.581465 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="extract-utilities" Mar 14 07:09:25 crc kubenswrapper[4713]: E0314 07:09:25.581475 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.581481 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" Mar 14 07:09:25 crc kubenswrapper[4713]: E0314 07:09:25.581508 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="extract-content" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.581513 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="extract-content" Mar 14 07:09:25 crc kubenswrapper[4713]: E0314 07:09:25.581522 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d08e70-3946-4a9a-9c14-1078698c2383" containerName="oc" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.581528 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d08e70-3946-4a9a-9c14-1078698c2383" containerName="oc" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.581774 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="26621cc1-fd4e-463b-aea6-65b2d9c4c85d" containerName="container-00" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.581809 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3d08e70-3946-4a9a-9c14-1078698c2383" containerName="oc" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.581824 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="762bc0d1-afca-4c67-9336-2855e56cb9e4" containerName="registry-server" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.582886 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-hz28j" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.701309 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqbf5\" (UniqueName: \"kubernetes.io/projected/0db1008c-1e53-4ae6-a898-0cd653ea81ef-kube-api-access-lqbf5\") pod \"crc-debug-hz28j\" (UID: \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\") " pod="openshift-must-gather-8mfn2/crc-debug-hz28j" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.701764 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0db1008c-1e53-4ae6-a898-0cd653ea81ef-host\") pod \"crc-debug-hz28j\" (UID: \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\") " pod="openshift-must-gather-8mfn2/crc-debug-hz28j" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.804648 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0db1008c-1e53-4ae6-a898-0cd653ea81ef-host\") pod \"crc-debug-hz28j\" (UID: \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\") " pod="openshift-must-gather-8mfn2/crc-debug-hz28j" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.804813 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqbf5\" (UniqueName: \"kubernetes.io/projected/0db1008c-1e53-4ae6-a898-0cd653ea81ef-kube-api-access-lqbf5\") pod \"crc-debug-hz28j\" (UID: \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\") " pod="openshift-must-gather-8mfn2/crc-debug-hz28j" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.804837 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0db1008c-1e53-4ae6-a898-0cd653ea81ef-host\") pod \"crc-debug-hz28j\" (UID: \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\") " pod="openshift-must-gather-8mfn2/crc-debug-hz28j" Mar 14 07:09:25 crc 
kubenswrapper[4713]: I0314 07:09:25.826030 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqbf5\" (UniqueName: \"kubernetes.io/projected/0db1008c-1e53-4ae6-a898-0cd653ea81ef-kube-api-access-lqbf5\") pod \"crc-debug-hz28j\" (UID: \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\") " pod="openshift-must-gather-8mfn2/crc-debug-hz28j" Mar 14 07:09:25 crc kubenswrapper[4713]: I0314 07:09:25.904993 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-hz28j" Mar 14 07:09:26 crc kubenswrapper[4713]: I0314 07:09:26.183065 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/crc-debug-hz28j" event={"ID":"0db1008c-1e53-4ae6-a898-0cd653ea81ef","Type":"ContainerStarted","Data":"4c56e70d775e4e698496814b43b121be2ff3ce1a6f50d8ecb8efa34ca2efe3c8"} Mar 14 07:09:27 crc kubenswrapper[4713]: I0314 07:09:27.202072 4713 generic.go:334] "Generic (PLEG): container finished" podID="0db1008c-1e53-4ae6-a898-0cd653ea81ef" containerID="1a766e8cf4550dbd5ecdfaae3e67f4d0255371491a6684ec1acc665a8b9ea767" exitCode=0 Mar 14 07:09:27 crc kubenswrapper[4713]: I0314 07:09:27.202218 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/crc-debug-hz28j" event={"ID":"0db1008c-1e53-4ae6-a898-0cd653ea81ef","Type":"ContainerDied","Data":"1a766e8cf4550dbd5ecdfaae3e67f4d0255371491a6684ec1acc665a8b9ea767"} Mar 14 07:09:28 crc kubenswrapper[4713]: I0314 07:09:28.407637 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-hz28j" Mar 14 07:09:28 crc kubenswrapper[4713]: I0314 07:09:28.490526 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqbf5\" (UniqueName: \"kubernetes.io/projected/0db1008c-1e53-4ae6-a898-0cd653ea81ef-kube-api-access-lqbf5\") pod \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\" (UID: \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\") " Mar 14 07:09:28 crc kubenswrapper[4713]: I0314 07:09:28.490667 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0db1008c-1e53-4ae6-a898-0cd653ea81ef-host\") pod \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\" (UID: \"0db1008c-1e53-4ae6-a898-0cd653ea81ef\") " Mar 14 07:09:28 crc kubenswrapper[4713]: I0314 07:09:28.490792 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0db1008c-1e53-4ae6-a898-0cd653ea81ef-host" (OuterVolumeSpecName: "host") pod "0db1008c-1e53-4ae6-a898-0cd653ea81ef" (UID: "0db1008c-1e53-4ae6-a898-0cd653ea81ef"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:09:28 crc kubenswrapper[4713]: I0314 07:09:28.491643 4713 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0db1008c-1e53-4ae6-a898-0cd653ea81ef-host\") on node \"crc\" DevicePath \"\"" Mar 14 07:09:28 crc kubenswrapper[4713]: I0314 07:09:28.503476 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db1008c-1e53-4ae6-a898-0cd653ea81ef-kube-api-access-lqbf5" (OuterVolumeSpecName: "kube-api-access-lqbf5") pod "0db1008c-1e53-4ae6-a898-0cd653ea81ef" (UID: "0db1008c-1e53-4ae6-a898-0cd653ea81ef"). InnerVolumeSpecName "kube-api-access-lqbf5". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:09:28 crc kubenswrapper[4713]: I0314 07:09:28.593465 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqbf5\" (UniqueName: \"kubernetes.io/projected/0db1008c-1e53-4ae6-a898-0cd653ea81ef-kube-api-access-lqbf5\") on node \"crc\" DevicePath \"\""
Mar 14 07:09:29 crc kubenswrapper[4713]: I0314 07:09:29.157546 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8mfn2/crc-debug-hz28j"]
Mar 14 07:09:29 crc kubenswrapper[4713]: I0314 07:09:29.169990 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8mfn2/crc-debug-hz28j"]
Mar 14 07:09:29 crc kubenswrapper[4713]: I0314 07:09:29.238595 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c56e70d775e4e698496814b43b121be2ff3ce1a6f50d8ecb8efa34ca2efe3c8"
Mar 14 07:09:29 crc kubenswrapper[4713]: I0314 07:09:29.238682 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-hz28j"
Mar 14 07:09:29 crc kubenswrapper[4713]: I0314 07:09:29.577013 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0db1008c-1e53-4ae6-a898-0cd653ea81ef" path="/var/lib/kubelet/pods/0db1008c-1e53-4ae6-a898-0cd653ea81ef/volumes"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.321170 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8mfn2/crc-debug-h45jz"]
Mar 14 07:09:30 crc kubenswrapper[4713]: E0314 07:09:30.321929 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db1008c-1e53-4ae6-a898-0cd653ea81ef" containerName="container-00"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.321946 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db1008c-1e53-4ae6-a898-0cd653ea81ef" containerName="container-00"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.322198 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db1008c-1e53-4ae6-a898-0cd653ea81ef" containerName="container-00"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.323337 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.449714 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-host\") pod \"crc-debug-h45jz\" (UID: \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\") " pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.450132 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gn7r\" (UniqueName: \"kubernetes.io/projected/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-kube-api-access-8gn7r\") pod \"crc-debug-h45jz\" (UID: \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\") " pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.555236 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-host\") pod \"crc-debug-h45jz\" (UID: \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\") " pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.555396 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gn7r\" (UniqueName: \"kubernetes.io/projected/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-kube-api-access-8gn7r\") pod \"crc-debug-h45jz\" (UID: \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\") " pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.555858 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-host\") pod \"crc-debug-h45jz\" (UID: \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\") " pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.576451 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gn7r\" (UniqueName: \"kubernetes.io/projected/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-kube-api-access-8gn7r\") pod \"crc-debug-h45jz\" (UID: \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\") " pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:30 crc kubenswrapper[4713]: I0314 07:09:30.643786 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:31 crc kubenswrapper[4713]: E0314 07:09:31.131531 4713 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16ddee7f_7436_4db4_8ea5_ebe877b0ebeb.slice/crio-conmon-31337e0064c2fc582abb8a4c79ab87f5a0fd99b206d093c5b1ff9057be3b0144.scope\": RecentStats: unable to find data in memory cache]"
Mar 14 07:09:31 crc kubenswrapper[4713]: I0314 07:09:31.264020 4713 generic.go:334] "Generic (PLEG): container finished" podID="16ddee7f-7436-4db4-8ea5-ebe877b0ebeb" containerID="31337e0064c2fc582abb8a4c79ab87f5a0fd99b206d093c5b1ff9057be3b0144" exitCode=0
Mar 14 07:09:31 crc kubenswrapper[4713]: I0314 07:09:31.264355 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/crc-debug-h45jz" event={"ID":"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb","Type":"ContainerDied","Data":"31337e0064c2fc582abb8a4c79ab87f5a0fd99b206d093c5b1ff9057be3b0144"}
Mar 14 07:09:31 crc kubenswrapper[4713]: I0314 07:09:31.264413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/crc-debug-h45jz" event={"ID":"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb","Type":"ContainerStarted","Data":"214d6e446ad1bf790de7d4c4d778a7deb6b8522de95b1d836a9b7f6aee0abd2b"}
Mar 14 07:09:31 crc kubenswrapper[4713]: I0314 07:09:31.327517 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8mfn2/crc-debug-h45jz"]
Mar 14 07:09:31 crc kubenswrapper[4713]: I0314 07:09:31.344313 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8mfn2/crc-debug-h45jz"]
Mar 14 07:09:32 crc kubenswrapper[4713]: I0314 07:09:32.409382 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:32 crc kubenswrapper[4713]: I0314 07:09:32.499489 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-host\") pod \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\" (UID: \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\") "
Mar 14 07:09:32 crc kubenswrapper[4713]: I0314 07:09:32.499659 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-host" (OuterVolumeSpecName: "host") pod "16ddee7f-7436-4db4-8ea5-ebe877b0ebeb" (UID: "16ddee7f-7436-4db4-8ea5-ebe877b0ebeb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 14 07:09:32 crc kubenswrapper[4713]: I0314 07:09:32.499917 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gn7r\" (UniqueName: \"kubernetes.io/projected/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-kube-api-access-8gn7r\") pod \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\" (UID: \"16ddee7f-7436-4db4-8ea5-ebe877b0ebeb\") "
Mar 14 07:09:32 crc kubenswrapper[4713]: I0314 07:09:32.500982 4713 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-host\") on node \"crc\" DevicePath \"\""
Mar 14 07:09:32 crc kubenswrapper[4713]: I0314 07:09:32.507490 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-kube-api-access-8gn7r" (OuterVolumeSpecName: "kube-api-access-8gn7r") pod "16ddee7f-7436-4db4-8ea5-ebe877b0ebeb" (UID: "16ddee7f-7436-4db4-8ea5-ebe877b0ebeb"). InnerVolumeSpecName "kube-api-access-8gn7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:09:32 crc kubenswrapper[4713]: I0314 07:09:32.603981 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gn7r\" (UniqueName: \"kubernetes.io/projected/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb-kube-api-access-8gn7r\") on node \"crc\" DevicePath \"\""
Mar 14 07:09:33 crc kubenswrapper[4713]: I0314 07:09:33.297765 4713 scope.go:117] "RemoveContainer" containerID="31337e0064c2fc582abb8a4c79ab87f5a0fd99b206d093c5b1ff9057be3b0144"
Mar 14 07:09:33 crc kubenswrapper[4713]: I0314 07:09:33.297950 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/crc-debug-h45jz"
Mar 14 07:09:33 crc kubenswrapper[4713]: I0314 07:09:33.568181 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:09:33 crc kubenswrapper[4713]: E0314 07:09:33.570854 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:09:33 crc kubenswrapper[4713]: I0314 07:09:33.583754 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ddee7f-7436-4db4-8ea5-ebe877b0ebeb" path="/var/lib/kubelet/pods/16ddee7f-7436-4db4-8ea5-ebe877b0ebeb/volumes"
Mar 14 07:09:48 crc kubenswrapper[4713]: I0314 07:09:48.564831 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:09:48 crc kubenswrapper[4713]: E0314 07:09:48.565715 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:09:59 crc kubenswrapper[4713]: I0314 07:09:59.925902 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0567902a-8618-4a33-b632-ef2b6555c113/aodh-api/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.133275 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0567902a-8618-4a33-b632-ef2b6555c113/aodh-evaluator/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.155149 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0567902a-8618-4a33-b632-ef2b6555c113/aodh-notifier/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.158875 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0567902a-8618-4a33-b632-ef2b6555c113/aodh-listener/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.182407 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557870-9jzfw"]
Mar 14 07:10:00 crc kubenswrapper[4713]: E0314 07:10:00.183161 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ddee7f-7436-4db4-8ea5-ebe877b0ebeb" containerName="container-00"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.183265 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ddee7f-7436-4db4-8ea5-ebe877b0ebeb" containerName="container-00"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.183584 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ddee7f-7436-4db4-8ea5-ebe877b0ebeb" containerName="container-00"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.184607 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-9jzfw"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.187536 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.187602 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.187802 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.196501 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-9jzfw"]
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.283341 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsgrz\" (UniqueName: \"kubernetes.io/projected/c3503770-8264-4526-9916-2fe62e985b1a-kube-api-access-jsgrz\") pod \"auto-csr-approver-29557870-9jzfw\" (UID: \"c3503770-8264-4526-9916-2fe62e985b1a\") " pod="openshift-infra/auto-csr-approver-29557870-9jzfw"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.360335 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-66dc7cf97d-ll6gt_c0226c41-0d23-4ea8-b8ff-0f1b20a04f68/barbican-api/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.386460 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsgrz\" (UniqueName: \"kubernetes.io/projected/c3503770-8264-4526-9916-2fe62e985b1a-kube-api-access-jsgrz\") pod \"auto-csr-approver-29557870-9jzfw\" (UID: \"c3503770-8264-4526-9916-2fe62e985b1a\") " pod="openshift-infra/auto-csr-approver-29557870-9jzfw"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.409331 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-66dc7cf97d-ll6gt_c0226c41-0d23-4ea8-b8ff-0f1b20a04f68/barbican-api-log/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.411172 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsgrz\" (UniqueName: \"kubernetes.io/projected/c3503770-8264-4526-9916-2fe62e985b1a-kube-api-access-jsgrz\") pod \"auto-csr-approver-29557870-9jzfw\" (UID: \"c3503770-8264-4526-9916-2fe62e985b1a\") " pod="openshift-infra/auto-csr-approver-29557870-9jzfw"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.513619 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-9jzfw"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.625386 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f98f48554-4fr2x_d1f41851-9a76-4730-9535-113163dd38dc/barbican-keystone-listener/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.722962 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f98f48554-4fr2x_d1f41851-9a76-4730-9535-113163dd38dc/barbican-keystone-listener-log/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.749062 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6c94555ff-7pxkm_b63291b9-15a4-43c2-ba17-be0374c459b5/barbican-worker/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.914278 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6c94555ff-7pxkm_b63291b9-15a4-43c2-ba17-be0374c459b5/barbican-worker-log/0.log"
Mar 14 07:10:00 crc kubenswrapper[4713]: I0314 07:10:00.981057 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-gxlnt_399681c2-4d54-4329-9e80-55ae24289ee5/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.070169 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-9jzfw"]
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.212037 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2c79b18-2189-46d9-bbd4-55f58870d723/ceilometer-central-agent/1.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.281463 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2c79b18-2189-46d9-bbd4-55f58870d723/ceilometer-notification-agent/0.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.285707 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2c79b18-2189-46d9-bbd4-55f58870d723/proxy-httpd/0.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.342054 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2c79b18-2189-46d9-bbd4-55f58870d723/ceilometer-central-agent/0.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.480798 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2c79b18-2189-46d9-bbd4-55f58870d723/sg-core/0.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.563603 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:10:01 crc kubenswrapper[4713]: E0314 07:10:01.563947 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.586124 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d1b88956-89b2-49f5-881a-f757d005ee2a/cinder-api-log/0.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.637184 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d1b88956-89b2-49f5-881a-f757d005ee2a/cinder-api/0.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.672030 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-9jzfw" event={"ID":"c3503770-8264-4526-9916-2fe62e985b1a","Type":"ContainerStarted","Data":"cd011c20e77381b65bf7ab2eba2b61989ec9e84cde0c75867e17c0e2310194c7"}
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.798741 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b61a43-5015-4b52-b55f-4ea941db9a0d/cinder-scheduler/1.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.847406 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b61a43-5015-4b52-b55f-4ea941db9a0d/cinder-scheduler/0.log"
Mar 14 07:10:01 crc kubenswrapper[4713]: I0314 07:10:01.900153 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b61a43-5015-4b52-b55f-4ea941db9a0d/probe/0.log"
Mar 14 07:10:02 crc kubenswrapper[4713]: I0314 07:10:02.036280 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-txv7f_55f62410-5eca-443c-86b0-39b49d969e9f/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:02 crc kubenswrapper[4713]: I0314 07:10:02.119653 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pxglp_1158aa83-e4b7-4231-a249-ee99e7f4d291/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:02 crc kubenswrapper[4713]: I0314 07:10:02.308640 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-bhxpb_4c2880bc-ddaa-44ac-81f5-05e29a7c05d0/init/0.log"
Mar 14 07:10:02 crc kubenswrapper[4713]: I0314 07:10:02.581139 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-bhxpb_4c2880bc-ddaa-44ac-81f5-05e29a7c05d0/init/0.log"
Mar 14 07:10:02 crc kubenswrapper[4713]: I0314 07:10:02.699724 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-bhxpb_4c2880bc-ddaa-44ac-81f5-05e29a7c05d0/dnsmasq-dns/0.log"
Mar 14 07:10:02 crc kubenswrapper[4713]: I0314 07:10:02.729254 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mdfxk_58fd92f9-e4e0-4d3a-8df0-dc21754faea3/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:03 crc kubenswrapper[4713]: I0314 07:10:03.010255 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5171aada-64eb-4788-8446-346549791051/glance-httpd/0.log"
Mar 14 07:10:03 crc kubenswrapper[4713]: I0314 07:10:03.026652 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5171aada-64eb-4788-8446-346549791051/glance-log/0.log"
Mar 14 07:10:03 crc kubenswrapper[4713]: I0314 07:10:03.167797 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b6b4ad11-424d-4394-b809-9fb4e559e255/glance-httpd/0.log"
Mar 14 07:10:03 crc kubenswrapper[4713]: I0314 07:10:03.284458 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b6b4ad11-424d-4394-b809-9fb4e559e255/glance-log/0.log"
Mar 14 07:10:03 crc kubenswrapper[4713]: I0314 07:10:03.705990 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-9jzfw" event={"ID":"c3503770-8264-4526-9916-2fe62e985b1a","Type":"ContainerStarted","Data":"6adf5afcf9e5a56dc70b7f035b2c30e2aa7e991c4e18eb8bfef832acfcc71d60"}
Mar 14 07:10:03 crc kubenswrapper[4713]: I0314 07:10:03.733716 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557870-9jzfw" podStartSLOduration=2.458767825 podStartE2EDuration="3.733696896s" podCreationTimestamp="2026-03-14 07:10:00 +0000 UTC" firstStartedPulling="2026-03-14 07:10:01.074085246 +0000 UTC m=+6184.161994546" lastFinishedPulling="2026-03-14 07:10:02.349014317 +0000 UTC m=+6185.436923617" observedRunningTime="2026-03-14 07:10:03.722172336 +0000 UTC m=+6186.810081626" watchObservedRunningTime="2026-03-14 07:10:03.733696896 +0000 UTC m=+6186.821606186"
Mar 14 07:10:03 crc kubenswrapper[4713]: I0314 07:10:03.981496 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-fflms_9ff839a7-6783-4526-a69b-66def7b3f8b4/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:04 crc kubenswrapper[4713]: I0314 07:10:04.269524 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5mtz2_aca83833-8133-4064-b1b3-05989d69b5b0/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:04 crc kubenswrapper[4713]: I0314 07:10:04.273127 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6d66999447-xvgnl_09cb708d-ad8b-4c22-9003-8e59fa88aa05/heat-engine/0.log"
Mar 14 07:10:04 crc kubenswrapper[4713]: I0314 07:10:04.501030 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-784d6d7c98-s79xh_6367bdc4-f55d-4f50-8b15-d1a05ce279e1/heat-api/0.log"
Mar 14 07:10:04 crc kubenswrapper[4713]: I0314 07:10:04.651134 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-56f6f749f8-hrbrr_6d4f1b6f-a408-4a91-9948-fe6bf54e13a9/heat-cfnapi/0.log"
Mar 14 07:10:04 crc kubenswrapper[4713]: I0314 07:10:04.719189 4713 generic.go:334] "Generic (PLEG): container finished" podID="c3503770-8264-4526-9916-2fe62e985b1a" containerID="6adf5afcf9e5a56dc70b7f035b2c30e2aa7e991c4e18eb8bfef832acfcc71d60" exitCode=0
Mar 14 07:10:04 crc kubenswrapper[4713]: I0314 07:10:04.719259 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-9jzfw" event={"ID":"c3503770-8264-4526-9916-2fe62e985b1a","Type":"ContainerDied","Data":"6adf5afcf9e5a56dc70b7f035b2c30e2aa7e991c4e18eb8bfef832acfcc71d60"}
Mar 14 07:10:04 crc kubenswrapper[4713]: I0314 07:10:04.763482 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29557801-7h79h_38b13150-e5f8-4a0e-93a6-c0f07c7e600e/keystone-cron/0.log"
Mar 14 07:10:04 crc kubenswrapper[4713]: I0314 07:10:04.927457 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29557861-m89sp_dbd85a78-aca0-4dc4-ba3e-492b3bf749f5/keystone-cron/0.log"
Mar 14 07:10:05 crc kubenswrapper[4713]: I0314 07:10:05.065045 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e0064af5-2496-4bcb-89b2-f9446d023d2b/kube-state-metrics/0.log"
Mar 14 07:10:05 crc kubenswrapper[4713]: I0314 07:10:05.250671 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-4t6lc_068a337b-3e10-4cdf-9883-6a9311bb4424/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:05 crc kubenswrapper[4713]: I0314 07:10:05.305033 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-69d88696fb-hfdtr_91752d56-0175-41ab-8cea-a8b7ab4c55cf/keystone-api/0.log"
Mar 14 07:10:05 crc kubenswrapper[4713]: I0314 07:10:05.320887 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-9mchp_63e52146-8f23-43ce-99dd-91c5c9f5b42d/logging-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:05 crc kubenswrapper[4713]: I0314 07:10:05.511488 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_80149b83-1a15-44d7-be14-8bd3ae881e7e/mysqld-exporter/0.log"
Mar 14 07:10:05 crc kubenswrapper[4713]: I0314 07:10:05.906940 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zc4zz_e89967bf-adf8-4756-9097-75e19857a93c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:05 crc kubenswrapper[4713]: I0314 07:10:05.988280 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6854d4949-xljzd_2d93f914-fdbb-4acc-83f8-30effe510c7e/neutron-httpd/0.log"
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.148275 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6854d4949-xljzd_2d93f914-fdbb-4acc-83f8-30effe510c7e/neutron-api/0.log"
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.243706 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-9jzfw"
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.335413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsgrz\" (UniqueName: \"kubernetes.io/projected/c3503770-8264-4526-9916-2fe62e985b1a-kube-api-access-jsgrz\") pod \"c3503770-8264-4526-9916-2fe62e985b1a\" (UID: \"c3503770-8264-4526-9916-2fe62e985b1a\") "
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.364996 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3503770-8264-4526-9916-2fe62e985b1a-kube-api-access-jsgrz" (OuterVolumeSpecName: "kube-api-access-jsgrz") pod "c3503770-8264-4526-9916-2fe62e985b1a" (UID: "c3503770-8264-4526-9916-2fe62e985b1a"). InnerVolumeSpecName "kube-api-access-jsgrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.438324 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsgrz\" (UniqueName: \"kubernetes.io/projected/c3503770-8264-4526-9916-2fe62e985b1a-kube-api-access-jsgrz\") on node \"crc\" DevicePath \"\""
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.759385 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-9jzfw" event={"ID":"c3503770-8264-4526-9916-2fe62e985b1a","Type":"ContainerDied","Data":"cd011c20e77381b65bf7ab2eba2b61989ec9e84cde0c75867e17c0e2310194c7"}
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.759429 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd011c20e77381b65bf7ab2eba2b61989ec9e84cde0c75867e17c0e2310194c7"
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.759484 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-9jzfw"
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.789316 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d88f915d-d72f-4586-b435-67d75d24ecc0/nova-cell0-conductor-conductor/0.log"
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.839292 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-qpsjp"]
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.865763 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-qpsjp"]
Mar 14 07:10:06 crc kubenswrapper[4713]: I0314 07:10:06.949018 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f0f4fde5-7672-47ca-9935-b0d5124f5b2d/nova-api-log/0.log"
Mar 14 07:10:07 crc kubenswrapper[4713]: I0314 07:10:07.310336 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_94aaad06-2233-4127-9b26-d9fc2b6ff597/nova-cell1-conductor-conductor/0.log"
Mar 14 07:10:07 crc kubenswrapper[4713]: I0314 07:10:07.473658 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bbb8a72f-747e-4a4b-942c-3487e6c2e476/nova-cell1-novncproxy-novncproxy/0.log"
Mar 14 07:10:07 crc kubenswrapper[4713]: I0314 07:10:07.586431 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d84c273f-a96b-49e0-aa6f-8c1808965f34" path="/var/lib/kubelet/pods/d84c273f-a96b-49e0-aa6f-8c1808965f34/volumes"
Mar 14 07:10:07 crc kubenswrapper[4713]: I0314 07:10:07.610291 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-g4tcw_713308d3-fe7b-40f0-84b6-671a2defaf7b/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:07 crc kubenswrapper[4713]: I0314 07:10:07.651932 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f0f4fde5-7672-47ca-9935-b0d5124f5b2d/nova-api-api/0.log"
Mar 14 07:10:07 crc kubenswrapper[4713]: I0314 07:10:07.885354 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6019f4ac-3776-409b-ba3c-64d1739791a7/nova-metadata-log/0.log"
Mar 14 07:10:08 crc kubenswrapper[4713]: I0314 07:10:08.135971 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b/mysql-bootstrap/0.log"
Mar 14 07:10:08 crc kubenswrapper[4713]: I0314 07:10:08.136399 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d520c464-934b-4fee-b00c-e4f227de360e/nova-scheduler-scheduler/0.log"
Mar 14 07:10:08 crc kubenswrapper[4713]: I0314 07:10:08.366814 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b/galera/0.log"
Mar 14 07:10:08 crc kubenswrapper[4713]: I0314 07:10:08.370703 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_02f1ca0a-6c7d-4e97-be8d-ac9cf2f3018b/mysql-bootstrap/0.log"
Mar 14 07:10:08 crc kubenswrapper[4713]: I0314 07:10:08.582163 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6019f4ac-3776-409b-ba3c-64d1739791a7/nova-metadata-metadata/0.log"
Mar 14 07:10:08 crc kubenswrapper[4713]: I0314 07:10:08.653715 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_051e4d6d-86dc-479f-a659-6f95b7baa817/mysql-bootstrap/0.log"
Mar 14 07:10:08 crc kubenswrapper[4713]: I0314 07:10:08.781737 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_051e4d6d-86dc-479f-a659-6f95b7baa817/mysql-bootstrap/0.log"
Mar 14 07:10:08 crc kubenswrapper[4713]: I0314 07:10:08.836935 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_051e4d6d-86dc-479f-a659-6f95b7baa817/galera/0.log"
Mar 14 07:10:08 crc kubenswrapper[4713]: I0314 07:10:08.915172 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_051e4d6d-86dc-479f-a659-6f95b7baa817/galera/1.log"
Mar 14 07:10:09 crc kubenswrapper[4713]: I0314 07:10:09.020333 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_869960ea-c2fe-4a61-8f70-2e7724af6426/openstackclient/0.log"
Mar 14 07:10:09 crc kubenswrapper[4713]: I0314 07:10:09.175017 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lk79w_2d0098e9-c3b0-4e48-8abd-6b0e45dc47ca/ovn-controller/0.log"
Mar 14 07:10:09 crc kubenswrapper[4713]: I0314 07:10:09.282264 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-shmpz_165fd5f2-33fe-4736-8787-75331305fc9b/openstack-network-exporter/0.log"
Mar 14 07:10:09 crc kubenswrapper[4713]: I0314 07:10:09.376804 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6pzwm_1475dc78-ec5d-45b6-a21d-0e6e6320a012/ovsdb-server-init/0.log"
Mar 14 07:10:09 crc kubenswrapper[4713]: I0314 07:10:09.638757 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6pzwm_1475dc78-ec5d-45b6-a21d-0e6e6320a012/ovsdb-server/0.log"
Mar 14 07:10:09 crc kubenswrapper[4713]: I0314 07:10:09.654784 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6pzwm_1475dc78-ec5d-45b6-a21d-0e6e6320a012/ovs-vswitchd/0.log"
Mar 14 07:10:09 crc kubenswrapper[4713]: I0314 07:10:09.664896 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6pzwm_1475dc78-ec5d-45b6-a21d-0e6e6320a012/ovsdb-server-init/0.log"
Mar 14 07:10:10 crc kubenswrapper[4713]: I0314 07:10:10.139935 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hbkx8_93bee75a-f41d-4e0b-8e3e-3a2c8eb63c38/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 14 07:10:10 crc kubenswrapper[4713]: I0314 07:10:10.210642 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ca274e3b-b1c1-4083-8a05-7b9a536fe088/openstack-network-exporter/0.log"
Mar 14 07:10:10 crc kubenswrapper[4713]: I0314 07:10:10.252154 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ca274e3b-b1c1-4083-8a05-7b9a536fe088/ovn-northd/0.log"
Mar 14 07:10:10 crc kubenswrapper[4713]: I0314 07:10:10.509472 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d30edab6-1aa0-47d8-a20d-d2d2d0d6185d/openstack-network-exporter/0.log"
Mar 14 07:10:10 crc kubenswrapper[4713]: I0314 07:10:10.512908 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d30edab6-1aa0-47d8-a20d-d2d2d0d6185d/ovsdbserver-nb/0.log"
Mar 14 07:10:10 crc kubenswrapper[4713]: I0314 07:10:10.702930 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7/openstack-network-exporter/0.log"
Mar 14 07:10:10 crc kubenswrapper[4713]: I0314 07:10:10.757844 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_92b8fdd2-6f8a-46d6-b301-d4e7e5aeb4f7/ovsdbserver-sb/0.log"
Mar 14 07:10:10 crc kubenswrapper[4713]: I0314 07:10:10.974785 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59cfcdd844-lx8mr_51065c42-7604-4da1-8119-395c8c1ace81/placement-api/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.093450 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5e00395b-5b37-4ba4-a4e7-7ad08388b053/init-config-reloader/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.105070 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59cfcdd844-lx8mr_51065c42-7604-4da1-8119-395c8c1ace81/placement-log/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.261335 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5e00395b-5b37-4ba4-a4e7-7ad08388b053/init-config-reloader/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.320363 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5e00395b-5b37-4ba4-a4e7-7ad08388b053/thanos-sidecar/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.339432 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5e00395b-5b37-4ba4-a4e7-7ad08388b053/config-reloader/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.359821 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5e00395b-5b37-4ba4-a4e7-7ad08388b053/prometheus/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.519184 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b83bd95f-ad77-4c7a-9e24-5d2320c7823d/setup-container/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.784260 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b83bd95f-ad77-4c7a-9e24-5d2320c7823d/rabbitmq/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.810022 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b83bd95f-ad77-4c7a-9e24-5d2320c7823d/setup-container/0.log"
Mar 14 07:10:11 crc kubenswrapper[4713]: I0314 07:10:11.850468 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_43aa4c5f-72e3-4b1c-8842-09c1af1abcc3/setup-container/0.log"
Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.123271 4713 log.go:25]
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_43aa4c5f-72e3-4b1c-8842-09c1af1abcc3/setup-container/0.log" Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.130357 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_43aa4c5f-72e3-4b1c-8842-09c1af1abcc3/rabbitmq/0.log" Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.184482 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_88fb9884-c3f2-4186-8161-159d30f0ee62/setup-container/0.log" Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.425542 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_88fb9884-c3f2-4186-8161-159d30f0ee62/setup-container/0.log" Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.486620 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8b7136fb-37b4-4b12-a917-37f2a708eedd/setup-container/0.log" Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.521376 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_88fb9884-c3f2-4186-8161-159d30f0ee62/rabbitmq/0.log" Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.707445 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8b7136fb-37b4-4b12-a917-37f2a708eedd/setup-container/0.log" Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.762101 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8b7136fb-37b4-4b12-a917-37f2a708eedd/rabbitmq/0.log" Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.796628 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-476vs_b8311ebd-6052-4bcf-98a4-15cde102418b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 07:10:12 crc kubenswrapper[4713]: I0314 07:10:12.979175 4713 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-6m99t_b4f93731-ee2b-4013-87a8-0ce7a242f506/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 07:10:13 crc kubenswrapper[4713]: I0314 07:10:13.053505 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-d56w9_484a7a0d-8b23-4b9f-a875-843d1d9145a0/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 07:10:13 crc kubenswrapper[4713]: I0314 07:10:13.239437 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9xnxz_1b5e5638-e71f-47c0-a136-7530a65e7053/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 07:10:13 crc kubenswrapper[4713]: I0314 07:10:13.371574 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j8xdw_cf1997f7-a846-45ea-bc72-6db299d42afe/ssh-known-hosts-edpm-deployment/0.log" Mar 14 07:10:13 crc kubenswrapper[4713]: I0314 07:10:13.578487 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75d59d8dc5-c57fx_c945cc41-0bca-48e1-97b9-d0fb8085e3ca/proxy-server/0.log" Mar 14 07:10:13 crc kubenswrapper[4713]: I0314 07:10:13.710705 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qc8rx_1341c453-d963-4a43-a264-0f94dd02b7dd/swift-ring-rebalance/0.log" Mar 14 07:10:13 crc kubenswrapper[4713]: I0314 07:10:13.807280 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75d59d8dc5-c57fx_c945cc41-0bca-48e1-97b9-d0fb8085e3ca/proxy-httpd/0.log" Mar 14 07:10:13 crc kubenswrapper[4713]: I0314 07:10:13.924192 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/account-auditor/0.log" Mar 14 07:10:13 crc kubenswrapper[4713]: I0314 07:10:13.964963 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/account-reaper/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.047406 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/account-replicator/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.123988 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/account-server/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.170915 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/container-auditor/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.258740 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/container-replicator/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.317362 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/container-updater/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.328660 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/container-server/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.446784 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/object-auditor/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.530099 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/object-expirer/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.598908 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/object-server/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.600574 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/object-replicator/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.662546 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/object-updater/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.821962 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/swift-recon-cron/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.845533 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80f03c3b-d224-4e9d-8e52-e0376b3f215f/rsync/0.log" Mar 14 07:10:14 crc kubenswrapper[4713]: I0314 07:10:14.985336 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-przw2_b1d72cf9-f971-476d-a917-bb56b1280ac0/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 07:10:15 crc kubenswrapper[4713]: I0314 07:10:15.141586 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-sbjkh_f7835d4f-7f3b-4b5f-8a3f-b950ec203b95/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 07:10:15 crc kubenswrapper[4713]: I0314 07:10:15.420607 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1a497ede-a36f-4e68-a3d0-9998e7c4851b/test-operator-logs-container/0.log" Mar 14 07:10:15 crc kubenswrapper[4713]: I0314 07:10:15.563917 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:10:15 crc 
kubenswrapper[4713]: E0314 07:10:15.564376 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:10:15 crc kubenswrapper[4713]: I0314 07:10:15.654481 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x8zkb_acd65abe-8ba5-4743-b778-d18f74ca3f2b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 07:10:16 crc kubenswrapper[4713]: I0314 07:10:16.131004 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0236ca7c-fd1b-42f0-805c-8d53e34a3cc1/tempest-tests-tempest-tests-runner/0.log" Mar 14 07:10:29 crc kubenswrapper[4713]: I0314 07:10:29.499940 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c32295a7-c2a9-461b-a4eb-9d9eeb2fc645/memcached/0.log" Mar 14 07:10:29 crc kubenswrapper[4713]: I0314 07:10:29.564532 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:10:29 crc kubenswrapper[4713]: E0314 07:10:29.564859 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:10:43 crc kubenswrapper[4713]: I0314 07:10:43.567166 4713 scope.go:117] "RemoveContainer" 
containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:10:43 crc kubenswrapper[4713]: E0314 07:10:43.569913 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.095536 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h5dhz"] Mar 14 07:10:45 crc kubenswrapper[4713]: E0314 07:10:45.096346 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3503770-8264-4526-9916-2fe62e985b1a" containerName="oc" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.096359 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3503770-8264-4526-9916-2fe62e985b1a" containerName="oc" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.096662 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3503770-8264-4526-9916-2fe62e985b1a" containerName="oc" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.098620 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.113195 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5dhz"] Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.173373 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6_90fc82f6-b9df-45d9-bdc7-6eae42f17b64/util/0.log" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.258471 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-catalog-content\") pod \"redhat-marketplace-h5dhz\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") " pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.258645 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4f8p\" (UniqueName: \"kubernetes.io/projected/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-kube-api-access-q4f8p\") pod \"redhat-marketplace-h5dhz\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") " pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.259110 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-utilities\") pod \"redhat-marketplace-h5dhz\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") " pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.361230 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-utilities\") 
pod \"redhat-marketplace-h5dhz\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") " pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.361349 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-catalog-content\") pod \"redhat-marketplace-h5dhz\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") " pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.361405 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4f8p\" (UniqueName: \"kubernetes.io/projected/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-kube-api-access-q4f8p\") pod \"redhat-marketplace-h5dhz\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") " pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.361974 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-catalog-content\") pod \"redhat-marketplace-h5dhz\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") " pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.363340 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-utilities\") pod \"redhat-marketplace-h5dhz\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") " pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.400108 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4f8p\" (UniqueName: \"kubernetes.io/projected/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-kube-api-access-q4f8p\") pod \"redhat-marketplace-h5dhz\" 
(UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") " pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.423018 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5dhz" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.436239 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6_90fc82f6-b9df-45d9-bdc7-6eae42f17b64/util/0.log" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.480055 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6_90fc82f6-b9df-45d9-bdc7-6eae42f17b64/pull/0.log" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.525421 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6_90fc82f6-b9df-45d9-bdc7-6eae42f17b64/pull/0.log" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.913022 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6_90fc82f6-b9df-45d9-bdc7-6eae42f17b64/util/0.log" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.929912 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6_90fc82f6-b9df-45d9-bdc7-6eae42f17b64/extract/0.log" Mar 14 07:10:45 crc kubenswrapper[4713]: I0314 07:10:45.931048 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a7711328nnjz6_90fc82f6-b9df-45d9-bdc7-6eae42f17b64/pull/0.log" Mar 14 07:10:46 crc kubenswrapper[4713]: I0314 07:10:46.036774 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-h5dhz"] Mar 14 07:10:46 crc kubenswrapper[4713]: I0314 07:10:46.277551 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5dhz" event={"ID":"3f76cb15-b0f0-4fda-9d6f-739169ee6f34","Type":"ContainerStarted","Data":"2e200648d8e46b5761cb6e82a6f31d0b767f37aa27589bd03efda20e743b0b6f"} Mar 14 07:10:46 crc kubenswrapper[4713]: I0314 07:10:46.418869 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-d47688694-kq9dl_12eb62d0-8721-4482-b4a3-148a61cea029/manager/0.log" Mar 14 07:10:46 crc kubenswrapper[4713]: I0314 07:10:46.557456 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-j4474_fa62dff3-1643-4e94-b31a-d56b21a2327d/manager/0.log" Mar 14 07:10:46 crc kubenswrapper[4713]: I0314 07:10:46.932925 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-zz92h_bae008e7-4329-4d30-9820-81daf4300f96/manager/0.log" Mar 14 07:10:47 crc kubenswrapper[4713]: I0314 07:10:47.198902 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-xvmqf_ca4c3a10-3f60-460b-ad21-258a757bf57c/manager/0.log" Mar 14 07:10:47 crc kubenswrapper[4713]: I0314 07:10:47.251964 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-j587n_6a712708-53c3-4854-9a45-3442ee780cdc/manager/0.log" Mar 14 07:10:47 crc kubenswrapper[4713]: I0314 07:10:47.305868 4713 generic.go:334] "Generic (PLEG): container finished" podID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerID="86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696" exitCode=0 Mar 14 07:10:47 crc kubenswrapper[4713]: I0314 07:10:47.305930 4713 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-h5dhz" event={"ID":"3f76cb15-b0f0-4fda-9d6f-739169ee6f34","Type":"ContainerDied","Data":"86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696"} Mar 14 07:10:47 crc kubenswrapper[4713]: I0314 07:10:47.802174 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-6xshs_9da309ad-34cc-4b06-b166-c571b5a39825/manager/0.log" Mar 14 07:10:48 crc kubenswrapper[4713]: I0314 07:10:48.058239 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54dc5b8f8d-fm6fr_cbc588fa-b052-4336-81fe-2fed809e251b/manager/0.log" Mar 14 07:10:48 crc kubenswrapper[4713]: I0314 07:10:48.224282 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-shvlw_46274028-feea-4c48-b086-44533fc3e996/manager/0.log" Mar 14 07:10:48 crc kubenswrapper[4713]: I0314 07:10:48.405550 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-57b484b4df-gbrmh_ccd30d62-2e42-4399-b1eb-dfde3782dcb8/manager/0.log" Mar 14 07:10:48 crc kubenswrapper[4713]: I0314 07:10:48.730693 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5b6b6b4c9f-46gkk_149dd450-69f3-4d71-aac3-90052dcf2253/manager/0.log" Mar 14 07:10:48 crc kubenswrapper[4713]: I0314 07:10:48.782560 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-6j4tq_4128f2c6-d929-4815-8502-291baf22f24f/manager/0.log" Mar 14 07:10:49 crc kubenswrapper[4713]: I0314 07:10:49.011148 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-7t4g8_d5c6be47-5c06-46e0-ae8c-87b7a3f23561/manager/0.log" Mar 14 07:10:49 crc 
kubenswrapper[4713]: I0314 07:10:49.233135 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7f84474648-8mm88_409a2a8b-7e66-4763-9698-3a909f051c50/manager/0.log" Mar 14 07:10:49 crc kubenswrapper[4713]: I0314 07:10:49.317236 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-nfgb5_50d43641-0638-4763-9123-0c0c2c76629e/manager/0.log" Mar 14 07:10:49 crc kubenswrapper[4713]: I0314 07:10:49.337455 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5dhz" event={"ID":"3f76cb15-b0f0-4fda-9d6f-739169ee6f34","Type":"ContainerStarted","Data":"b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d"} Mar 14 07:10:49 crc kubenswrapper[4713]: I0314 07:10:49.552656 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7bc7k2_4da1ed21-82a5-400c-a201-653fe58adf4c/manager/0.log" Mar 14 07:10:50 crc kubenswrapper[4713]: I0314 07:10:50.064965 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-64f68cccc7-5r6v2_129ebe3f-95aa-42f1-8f56-1d3120fb5419/operator/0.log" Mar 14 07:10:50 crc kubenswrapper[4713]: I0314 07:10:50.104519 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-44kkb_5a437700-77f6-4838-9a7d-89eda8a27afa/registry-server/1.log" Mar 14 07:10:50 crc kubenswrapper[4713]: I0314 07:10:50.322505 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-44kkb_5a437700-77f6-4838-9a7d-89eda8a27afa/registry-server/0.log" Mar 14 07:10:50 crc kubenswrapper[4713]: I0314 07:10:50.369756 4713 generic.go:334] "Generic (PLEG): container finished" podID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" 
containerID="b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d" exitCode=0 Mar 14 07:10:50 crc kubenswrapper[4713]: I0314 07:10:50.369809 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5dhz" event={"ID":"3f76cb15-b0f0-4fda-9d6f-739169ee6f34","Type":"ContainerDied","Data":"b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d"} Mar 14 07:10:50 crc kubenswrapper[4713]: I0314 07:10:50.547812 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-kv59d_a55d0754-702d-4dbc-995a-b98d852678ce/manager/0.log" Mar 14 07:10:51 crc kubenswrapper[4713]: I0314 07:10:51.015993 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mlh55_9be88166-e4c0-464c-9dc0-a8a51595c555/operator/0.log" Mar 14 07:10:51 crc kubenswrapper[4713]: I0314 07:10:51.025423 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-xmbz2_aa4ff369-f2af-439f-b9f6-2c8301e80210/manager/0.log" Mar 14 07:10:51 crc kubenswrapper[4713]: I0314 07:10:51.346402 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f9cc5dd44-2r2fz_383e8493-0661-4b45-a72c-5851b520c65b/manager/0.log" Mar 14 07:10:51 crc kubenswrapper[4713]: I0314 07:10:51.398812 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5dhz" event={"ID":"3f76cb15-b0f0-4fda-9d6f-739169ee6f34","Type":"ContainerStarted","Data":"1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1"} Mar 14 07:10:51 crc kubenswrapper[4713]: I0314 07:10:51.423175 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h5dhz" podStartSLOduration=2.928771764 podStartE2EDuration="6.423154883s" 
podCreationTimestamp="2026-03-14 07:10:45 +0000 UTC" firstStartedPulling="2026-03-14 07:10:47.312021066 +0000 UTC m=+6230.399930366" lastFinishedPulling="2026-03-14 07:10:50.806404185 +0000 UTC m=+6233.894313485" observedRunningTime="2026-03-14 07:10:51.421180431 +0000 UTC m=+6234.509089741" watchObservedRunningTime="2026-03-14 07:10:51.423154883 +0000 UTC m=+6234.511064183"
Mar 14 07:10:51 crc kubenswrapper[4713]: I0314 07:10:51.866362    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-75b7bc4c47-ltr87_4101fac4-706c-4e2b-9203-102d0874c3ba/manager/0.log"
Mar 14 07:10:51 crc kubenswrapper[4713]: I0314 07:10:51.957866    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-6lw5h_fd0d55e7-5a0e-4f31-a75e-ed0b72bfa4dd/manager/0.log"
Mar 14 07:10:51 crc kubenswrapper[4713]: I0314 07:10:51.997119    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-85d9999fbb-kdnkw_92164fd9-b08c-4b00-975c-0fcdd245f8f9/manager/0.log"
Mar 14 07:10:52 crc kubenswrapper[4713]: I0314 07:10:52.071125    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-pqnf9_3941b4bd-470d-4351-aed9-4bc1f90f9ad4/manager/0.log"
Mar 14 07:10:55 crc kubenswrapper[4713]: I0314 07:10:55.424341    4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h5dhz"
Mar 14 07:10:55 crc kubenswrapper[4713]: I0314 07:10:55.425463    4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h5dhz"
Mar 14 07:10:55 crc kubenswrapper[4713]: I0314 07:10:55.487068    4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h5dhz"
Mar 14 07:10:57 crc kubenswrapper[4713]: I0314 07:10:57.396042    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:10:57 crc kubenswrapper[4713]: E0314 07:10:57.396387    4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:10:58 crc kubenswrapper[4713]: I0314 07:10:58.689121    4713 scope.go:117] "RemoveContainer" containerID="36c4f910c4e3a9c6398f0f29d15f2ffd8717565ce2eca9c78ae30e7881f7afa4"
Mar 14 07:11:05 crc kubenswrapper[4713]: I0314 07:11:05.482556    4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h5dhz"
Mar 14 07:11:05 crc kubenswrapper[4713]: I0314 07:11:05.552401    4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5dhz"]
Mar 14 07:11:05 crc kubenswrapper[4713]: I0314 07:11:05.552660    4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h5dhz" podUID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerName="registry-server" containerID="cri-o://1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1" gracePeriod=2
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.227310    4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5dhz"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.281946    4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-utilities\") pod \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") "
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.282144    4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-catalog-content\") pod \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") "
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.282248    4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4f8p\" (UniqueName: \"kubernetes.io/projected/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-kube-api-access-q4f8p\") pod \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\" (UID: \"3f76cb15-b0f0-4fda-9d6f-739169ee6f34\") "
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.282907    4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-utilities" (OuterVolumeSpecName: "utilities") pod "3f76cb15-b0f0-4fda-9d6f-739169ee6f34" (UID: "3f76cb15-b0f0-4fda-9d6f-739169ee6f34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.292572    4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-kube-api-access-q4f8p" (OuterVolumeSpecName: "kube-api-access-q4f8p") pod "3f76cb15-b0f0-4fda-9d6f-739169ee6f34" (UID: "3f76cb15-b0f0-4fda-9d6f-739169ee6f34"). InnerVolumeSpecName "kube-api-access-q4f8p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.311628    4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f76cb15-b0f0-4fda-9d6f-739169ee6f34" (UID: "3f76cb15-b0f0-4fda-9d6f-739169ee6f34"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.385639    4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.385676    4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.385687    4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4f8p\" (UniqueName: \"kubernetes.io/projected/3f76cb15-b0f0-4fda-9d6f-739169ee6f34-kube-api-access-q4f8p\") on node \"crc\" DevicePath \"\""
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.532718    4713 generic.go:334] "Generic (PLEG): container finished" podID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerID="1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1" exitCode=0
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.532781    4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5dhz"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.532797    4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5dhz" event={"ID":"3f76cb15-b0f0-4fda-9d6f-739169ee6f34","Type":"ContainerDied","Data":"1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1"}
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.533243    4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5dhz" event={"ID":"3f76cb15-b0f0-4fda-9d6f-739169ee6f34","Type":"ContainerDied","Data":"2e200648d8e46b5761cb6e82a6f31d0b767f37aa27589bd03efda20e743b0b6f"}
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.533265    4713 scope.go:117] "RemoveContainer" containerID="1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.577197    4713 scope.go:117] "RemoveContainer" containerID="b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.594871    4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5dhz"]
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.608604    4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5dhz"]
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.624905    4713 scope.go:117] "RemoveContainer" containerID="86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.677452    4713 scope.go:117] "RemoveContainer" containerID="1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1"
Mar 14 07:11:06 crc kubenswrapper[4713]: E0314 07:11:06.678621    4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1\": container with ID starting with 1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1 not found: ID does not exist" containerID="1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.678848    4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1"} err="failed to get container status \"1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1\": rpc error: code = NotFound desc = could not find container \"1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1\": container with ID starting with 1bb992032fd716abc4cc97c922e56d2be59dae936e45b9ffad524f59c0f017f1 not found: ID does not exist"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.678875    4713 scope.go:117] "RemoveContainer" containerID="b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d"
Mar 14 07:11:06 crc kubenswrapper[4713]: E0314 07:11:06.679338    4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d\": container with ID starting with b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d not found: ID does not exist" containerID="b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.679389    4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d"} err="failed to get container status \"b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d\": rpc error: code = NotFound desc = could not find container \"b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d\": container with ID starting with b9da67f24d9d4980fb2165ee0ecd2ec4a182cb1823e874353dd2fb6bed535e2d not found: ID does not exist"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.679417    4713 scope.go:117] "RemoveContainer" containerID="86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696"
Mar 14 07:11:06 crc kubenswrapper[4713]: E0314 07:11:06.679736    4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696\": container with ID starting with 86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696 not found: ID does not exist" containerID="86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696"
Mar 14 07:11:06 crc kubenswrapper[4713]: I0314 07:11:06.679775    4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696"} err="failed to get container status \"86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696\": rpc error: code = NotFound desc = could not find container \"86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696\": container with ID starting with 86c12fb4125c774fbf9bee3dd015a5eb2222386adad083782ee08eec9c5c1696 not found: ID does not exist"
Mar 14 07:11:07 crc kubenswrapper[4713]: I0314 07:11:07.601698    4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" path="/var/lib/kubelet/pods/3f76cb15-b0f0-4fda-9d6f-739169ee6f34/volumes"
Mar 14 07:11:10 crc kubenswrapper[4713]: I0314 07:11:10.565502    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:11:10 crc kubenswrapper[4713]: E0314 07:11:10.566433    4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:11:18 crc kubenswrapper[4713]: I0314 07:11:18.001475    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v9q67_63d1129b-c978-4cb1-a73f-e09786113590/control-plane-machine-set-operator/0.log"
Mar 14 07:11:18 crc kubenswrapper[4713]: I0314 07:11:18.339439    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vmsm7_80e7a49a-8aa3-41d2-b1bf-74a0689fbee6/kube-rbac-proxy/0.log"
Mar 14 07:11:18 crc kubenswrapper[4713]: I0314 07:11:18.380336    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vmsm7_80e7a49a-8aa3-41d2-b1bf-74a0689fbee6/machine-api-operator/0.log"
Mar 14 07:11:21 crc kubenswrapper[4713]: I0314 07:11:21.564696    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:11:21 crc kubenswrapper[4713]: E0314 07:11:21.565578    4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:11:32 crc kubenswrapper[4713]: I0314 07:11:32.180302    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-ljr8h_7df19eb2-dee5-4d0e-a141-fd7076e3b2a4/cert-manager-controller/0.log"
Mar 14 07:11:32 crc kubenswrapper[4713]: I0314 07:11:32.310629    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-hjkpk_b0927073-aaac-4b3e-93e6-160c866785ad/cert-manager-cainjector/0.log"
Mar 14 07:11:32 crc kubenswrapper[4713]: I0314 07:11:32.404703    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-xdfql_6dc92be5-3e9b-4e15-8eaa-a1a7ef51b21c/cert-manager-webhook/0.log"
Mar 14 07:11:32 crc kubenswrapper[4713]: I0314 07:11:32.564235    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:11:32 crc kubenswrapper[4713]: E0314 07:11:32.564791    4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:11:45 crc kubenswrapper[4713]: I0314 07:11:45.684883    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-bq9hp_d5bad799-6929-4d8a-ab6e-7463d787e8e0/nmstate-console-plugin/0.log"
Mar 14 07:11:45 crc kubenswrapper[4713]: I0314 07:11:45.950711    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-cdmmg_0eb6e7a3-da24-4bd6-8850-db445903fc2a/nmstate-handler/0.log"
Mar 14 07:11:45 crc kubenswrapper[4713]: I0314 07:11:45.985804    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xscpb_01d732e4-c4d2-4b34-8335-8f5f9b2299cd/kube-rbac-proxy/0.log"
Mar 14 07:11:46 crc kubenswrapper[4713]: I0314 07:11:46.201521    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-t6jvc_0069d74d-74f2-4d61-b37f-d710febf8c1e/nmstate-operator/0.log"
Mar 14 07:11:46 crc kubenswrapper[4713]: I0314 07:11:46.210649    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xscpb_01d732e4-c4d2-4b34-8335-8f5f9b2299cd/nmstate-metrics/0.log"
Mar 14 07:11:46 crc kubenswrapper[4713]: I0314 07:11:46.341264    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-25r25_a3c3dff8-a2ea-4073-a6ca-c391aaf296d0/nmstate-webhook/0.log"
Mar 14 07:11:46 crc kubenswrapper[4713]: I0314 07:11:46.564195    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:11:46 crc kubenswrapper[4713]: E0314 07:11:46.564474    4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:11:57 crc kubenswrapper[4713]: I0314 07:11:57.572899    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:11:57 crc kubenswrapper[4713]: E0314 07:11:57.573627    4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:11:59 crc kubenswrapper[4713]: I0314 07:11:59.396739    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-66945dfc9f-xqf5p_aec4bfff-0bef-401b-9db6-f9046825614a/kube-rbac-proxy/0.log"
Mar 14 07:11:59 crc kubenswrapper[4713]: I0314 07:11:59.464965    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-66945dfc9f-xqf5p_aec4bfff-0bef-401b-9db6-f9046825614a/manager/0.log"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.146170    4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557872-f7jcd"]
Mar 14 07:12:00 crc kubenswrapper[4713]: E0314 07:12:00.146897    4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerName="extract-utilities"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.146918    4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerName="extract-utilities"
Mar 14 07:12:00 crc kubenswrapper[4713]: E0314 07:12:00.146929    4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerName="extract-content"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.146937    4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerName="extract-content"
Mar 14 07:12:00 crc kubenswrapper[4713]: E0314 07:12:00.146952    4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerName="registry-server"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.146959    4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerName="registry-server"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.147233    4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f76cb15-b0f0-4fda-9d6f-739169ee6f34" containerName="registry-server"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.148078    4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-f7jcd"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.155162    4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.155779    4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.157038    4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-f7jcd"]
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.157937    4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.227489    4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5zgp\" (UniqueName: \"kubernetes.io/projected/b46cfb8e-338f-4281-84a1-59c3c42b7841-kube-api-access-l5zgp\") pod \"auto-csr-approver-29557872-f7jcd\" (UID: \"b46cfb8e-338f-4281-84a1-59c3c42b7841\") " pod="openshift-infra/auto-csr-approver-29557872-f7jcd"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.329612    4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5zgp\" (UniqueName: \"kubernetes.io/projected/b46cfb8e-338f-4281-84a1-59c3c42b7841-kube-api-access-l5zgp\") pod \"auto-csr-approver-29557872-f7jcd\" (UID: \"b46cfb8e-338f-4281-84a1-59c3c42b7841\") " pod="openshift-infra/auto-csr-approver-29557872-f7jcd"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.353292    4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5zgp\" (UniqueName: \"kubernetes.io/projected/b46cfb8e-338f-4281-84a1-59c3c42b7841-kube-api-access-l5zgp\") pod \"auto-csr-approver-29557872-f7jcd\" (UID: \"b46cfb8e-338f-4281-84a1-59c3c42b7841\") " pod="openshift-infra/auto-csr-approver-29557872-f7jcd"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.472696    4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-f7jcd"
Mar 14 07:12:00 crc kubenswrapper[4713]: I0314 07:12:00.971905    4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-f7jcd"]
Mar 14 07:12:01 crc kubenswrapper[4713]: I0314 07:12:01.148107    4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-f7jcd" event={"ID":"b46cfb8e-338f-4281-84a1-59c3c42b7841","Type":"ContainerStarted","Data":"499d38e8ee5691f3e2d466179de1ab85ea99e70411ee036c92d010e0da52336d"}
Mar 14 07:12:03 crc kubenswrapper[4713]: I0314 07:12:03.185720    4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-f7jcd" event={"ID":"b46cfb8e-338f-4281-84a1-59c3c42b7841","Type":"ContainerStarted","Data":"630b70f850b6ba80c46b43d28c760bae5130e755d7f04c821de548c78ab9d62a"}
Mar 14 07:12:03 crc kubenswrapper[4713]: I0314 07:12:03.313978    4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557872-f7jcd" podStartSLOduration=2.2732207300000002 podStartE2EDuration="3.313943114s" podCreationTimestamp="2026-03-14 07:12:00 +0000 UTC" firstStartedPulling="2026-03-14 07:12:00.982395893 +0000 UTC m=+6304.070305193" lastFinishedPulling="2026-03-14 07:12:02.023118277 +0000 UTC m=+6305.111027577" observedRunningTime="2026-03-14 07:12:03.228583608 +0000 UTC m=+6306.316492908" watchObservedRunningTime="2026-03-14 07:12:03.313943114 +0000 UTC m=+6306.401852404"
Mar 14 07:12:04 crc kubenswrapper[4713]: I0314 07:12:04.199091    4713 generic.go:334] "Generic (PLEG): container finished" podID="b46cfb8e-338f-4281-84a1-59c3c42b7841" containerID="630b70f850b6ba80c46b43d28c760bae5130e755d7f04c821de548c78ab9d62a" exitCode=0
Mar 14 07:12:04 crc kubenswrapper[4713]: I0314 07:12:04.199437    4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-f7jcd" event={"ID":"b46cfb8e-338f-4281-84a1-59c3c42b7841","Type":"ContainerDied","Data":"630b70f850b6ba80c46b43d28c760bae5130e755d7f04c821de548c78ab9d62a"}
Mar 14 07:12:05 crc kubenswrapper[4713]: I0314 07:12:05.792892    4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-f7jcd"
Mar 14 07:12:05 crc kubenswrapper[4713]: I0314 07:12:05.932579    4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5zgp\" (UniqueName: \"kubernetes.io/projected/b46cfb8e-338f-4281-84a1-59c3c42b7841-kube-api-access-l5zgp\") pod \"b46cfb8e-338f-4281-84a1-59c3c42b7841\" (UID: \"b46cfb8e-338f-4281-84a1-59c3c42b7841\") "
Mar 14 07:12:05 crc kubenswrapper[4713]: I0314 07:12:05.940429    4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46cfb8e-338f-4281-84a1-59c3c42b7841-kube-api-access-l5zgp" (OuterVolumeSpecName: "kube-api-access-l5zgp") pod "b46cfb8e-338f-4281-84a1-59c3c42b7841" (UID: "b46cfb8e-338f-4281-84a1-59c3c42b7841"). InnerVolumeSpecName "kube-api-access-l5zgp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:12:06 crc kubenswrapper[4713]: I0314 07:12:06.035910    4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5zgp\" (UniqueName: \"kubernetes.io/projected/b46cfb8e-338f-4281-84a1-59c3c42b7841-kube-api-access-l5zgp\") on node \"crc\" DevicePath \"\""
Mar 14 07:12:06 crc kubenswrapper[4713]: I0314 07:12:06.225625    4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-f7jcd" event={"ID":"b46cfb8e-338f-4281-84a1-59c3c42b7841","Type":"ContainerDied","Data":"499d38e8ee5691f3e2d466179de1ab85ea99e70411ee036c92d010e0da52336d"}
Mar 14 07:12:06 crc kubenswrapper[4713]: I0314 07:12:06.225664    4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="499d38e8ee5691f3e2d466179de1ab85ea99e70411ee036c92d010e0da52336d"
Mar 14 07:12:06 crc kubenswrapper[4713]: I0314 07:12:06.225936    4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-f7jcd"
Mar 14 07:12:06 crc kubenswrapper[4713]: I0314 07:12:06.878350    4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-dmt55"]
Mar 14 07:12:06 crc kubenswrapper[4713]: I0314 07:12:06.888477    4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-dmt55"]
Mar 14 07:12:07 crc kubenswrapper[4713]: I0314 07:12:07.578936    4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b57c93-ee45-43fa-b26d-5da0a6342431" path="/var/lib/kubelet/pods/50b57c93-ee45-43fa-b26d-5da0a6342431/volumes"
Mar 14 07:12:09 crc kubenswrapper[4713]: I0314 07:12:09.564571    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:12:09 crc kubenswrapper[4713]: E0314 07:12:09.565407    4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:12:14 crc kubenswrapper[4713]: I0314 07:12:14.570382    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kdwsl_428d4860-9850-46cc-82c8-5fcf46c06748/prometheus-operator/0.log"
Mar 14 07:12:14 crc kubenswrapper[4713]: I0314 07:12:14.751672    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_8f90d6ef-cc15-4d38-a2c5-f5e778500c73/prometheus-operator-admission-webhook/0.log"
Mar 14 07:12:14 crc kubenswrapper[4713]: I0314 07:12:14.842245    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_9bce1139-ffd0-4518-8f2c-7ef46b2892e4/prometheus-operator-admission-webhook/0.log"
Mar 14 07:12:15 crc kubenswrapper[4713]: I0314 07:12:15.085602    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-wnccn_94841b22-b2eb-4519-b04f-98010d848b46/operator/0.log"
Mar 14 07:12:15 crc kubenswrapper[4713]: I0314 07:12:15.091671    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-hj46d_265f475a-c481-4f65-a176-b3d4c55c691d/observability-ui-dashboards/0.log"
Mar 14 07:12:15 crc kubenswrapper[4713]: I0314 07:12:15.235671    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-hqw7p_f9178880-ef43-43c5-8e91-f4c46d4aa0c6/perses-operator/0.log"
Mar 14 07:12:20 crc kubenswrapper[4713]: I0314 07:12:20.564164    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:12:20 crc kubenswrapper[4713]: E0314 07:12:20.564953    4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:12:31 crc kubenswrapper[4713]: I0314 07:12:31.777085    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-f7fr7_6b3214ee-eee2-4b89-8491-2255fd1068be/cluster-logging-operator/0.log"
Mar 14 07:12:32 crc kubenswrapper[4713]: I0314 07:12:32.038257    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-7q2cm_3ee29a7a-b0fa-4cc6-9f7a-eca15d018d3f/collector/0.log"
Mar 14 07:12:32 crc kubenswrapper[4713]: I0314 07:12:32.101134    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_fd0e6ea3-0887-4eba-a83d-f76a405b0d56/loki-compactor/0.log"
Mar 14 07:12:32 crc kubenswrapper[4713]: I0314 07:12:32.255291    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-h7xw6_82a7870b-ac91-41f9-a94f-41db191e711b/loki-distributor/0.log"
Mar 14 07:12:32 crc kubenswrapper[4713]: I0314 07:12:32.333765    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54c568c9c8-98zs2_d77ba467-d131-42b6-9297-e30cbb7d9c57/gateway/0.log"
Mar 14 07:12:32 crc kubenswrapper[4713]: I0314 07:12:32.414523    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54c568c9c8-98zs2_d77ba467-d131-42b6-9297-e30cbb7d9c57/opa/0.log"
Mar 14 07:12:32 crc kubenswrapper[4713]: I0314 07:12:32.544636    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54c568c9c8-mwlnd_8eed3eb1-25e3-4d02-b5fd-d8f691af6c21/gateway/0.log"
Mar 14 07:12:32 crc kubenswrapper[4713]: I0314 07:12:32.645379    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-54c568c9c8-mwlnd_8eed3eb1-25e3-4d02-b5fd-d8f691af6c21/opa/0.log"
Mar 14 07:12:32 crc kubenswrapper[4713]: I0314 07:12:32.803531    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_ac2f3622-77e0-46da-95dc-1a17548790a7/loki-index-gateway/0.log"
Mar 14 07:12:32 crc kubenswrapper[4713]: I0314 07:12:32.987525    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_f04ba68c-50bf-406f-977f-7cf9b7d1f4b4/loki-ingester/0.log"
Mar 14 07:12:33 crc kubenswrapper[4713]: I0314 07:12:33.087329    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-rbp8f_ed72a9eb-a4ee-430c-9449-566f2c56c3bf/loki-querier/0.log"
Mar 14 07:12:33 crc kubenswrapper[4713]: I0314 07:12:33.281297    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-5d4kx_1d1f7414-07c8-48ab-bc8b-3892473aa10f/loki-query-frontend/0.log"
Mar 14 07:12:34 crc kubenswrapper[4713]: I0314 07:12:34.565582    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:12:34 crc kubenswrapper[4713]: E0314 07:12:34.566094    4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"
Mar 14 07:12:48 crc kubenswrapper[4713]: I0314 07:12:48.565256    4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0"
Mar 14 07:12:49 crc kubenswrapper[4713]: I0314 07:12:49.796748    4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"20996e6149458437a19e27927bb8d7167b3ef0d6b7d55d99f1f5eef50e76a65a"}
Mar 14 07:12:50 crc kubenswrapper[4713]: I0314 07:12:50.467637    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-knjfw_9640b5fe-f2ba-4a12-b456-1643ddc063f2/kube-rbac-proxy/0.log"
Mar 14 07:12:50 crc kubenswrapper[4713]: I0314 07:12:50.654367    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-knjfw_9640b5fe-f2ba-4a12-b456-1643ddc063f2/controller/0.log"
Mar 14 07:12:50 crc kubenswrapper[4713]: I0314 07:12:50.735702    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-frr-files/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.029044    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-reloader/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.069414    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-frr-files/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.109393    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-reloader/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.110228    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-metrics/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.328373    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-reloader/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.357734    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-frr-files/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.414433    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-metrics/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.419136    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-metrics/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.562006    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-frr-files/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.578465    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-reloader/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.666325    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/cp-metrics/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.688133    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/controller/1.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.828222    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/controller/0.log"
Mar 14 07:12:51 crc kubenswrapper[4713]: I0314 07:12:51.982258    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/frr-metrics/0.log"
Mar 14 07:12:52 crc kubenswrapper[4713]: I0314 07:12:52.093940    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/kube-rbac-proxy/0.log"
Mar 14 07:12:52 crc kubenswrapper[4713]: I0314 07:12:52.242159    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/kube-rbac-proxy-frr/0.log"
Mar 14 07:12:52 crc kubenswrapper[4713]: I0314 07:12:52.248483    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/frr/1.log"
Mar 14 07:12:52 crc kubenswrapper[4713]: I0314 07:12:52.361689    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/reloader/0.log"
Mar 14 07:12:52 crc kubenswrapper[4713]: I0314 07:12:52.574925    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-84hc6_4353213b-b89f-4288-babb-7afef0ca216a/frr-k8s-webhook-server/1.log"
Mar 14 07:12:52 crc kubenswrapper[4713]: I0314 07:12:52.773572    4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-84hc6_4353213b-b89f-4288-babb-7afef0ca216a/frr-k8s-webhook-server/0.log"
Mar 14 07:12:52 crc kubenswrapper[4713]: I0314 07:12:52.973662    4713 log.go:25] "Finished parsing log file"
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84b689b795-q7lfp_6c7267d1-1d86-4ea3-91c6-5edc53bdfe01/manager/0.log" Mar 14 07:12:53 crc kubenswrapper[4713]: I0314 07:12:53.163528 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b665cf668-zl2jw_8440ca7f-e5cd-4deb-9e52-8be733b65583/webhook-server/1.log" Mar 14 07:12:53 crc kubenswrapper[4713]: I0314 07:12:53.244710 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b665cf668-zl2jw_8440ca7f-e5cd-4deb-9e52-8be733b65583/webhook-server/0.log" Mar 14 07:12:53 crc kubenswrapper[4713]: I0314 07:12:53.429301 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cl7ll_a5cbbe27-0738-4819-a4bc-5bc7d2945248/kube-rbac-proxy/0.log" Mar 14 07:12:54 crc kubenswrapper[4713]: I0314 07:12:54.367957 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cl7ll_a5cbbe27-0738-4819-a4bc-5bc7d2945248/speaker/0.log" Mar 14 07:12:54 crc kubenswrapper[4713]: I0314 07:12:54.597175 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-64h8p_fb9f27cd-ac40-407e-b9a5-f9594122604f/frr/0.log" Mar 14 07:12:58 crc kubenswrapper[4713]: I0314 07:12:58.859523 4713 scope.go:117] "RemoveContainer" containerID="79be3d679d4460e53c6b426360b3562b94c3321d3ff25a0cf794bbbe316c218d" Mar 14 07:13:07 crc kubenswrapper[4713]: I0314 07:13:07.577807 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx_6fc0ef8b-404f-4614-9bb1-57550994c275/util/0.log" Mar 14 07:13:07 crc kubenswrapper[4713]: I0314 07:13:07.675316 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx_6fc0ef8b-404f-4614-9bb1-57550994c275/util/0.log" Mar 14 07:13:07 crc 
kubenswrapper[4713]: I0314 07:13:07.736872 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx_6fc0ef8b-404f-4614-9bb1-57550994c275/pull/0.log" Mar 14 07:13:07 crc kubenswrapper[4713]: I0314 07:13:07.794872 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx_6fc0ef8b-404f-4614-9bb1-57550994c275/pull/0.log" Mar 14 07:13:07 crc kubenswrapper[4713]: I0314 07:13:07.968927 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx_6fc0ef8b-404f-4614-9bb1-57550994c275/pull/0.log" Mar 14 07:13:07 crc kubenswrapper[4713]: I0314 07:13:07.975684 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx_6fc0ef8b-404f-4614-9bb1-57550994c275/util/0.log" Mar 14 07:13:07 crc kubenswrapper[4713]: I0314 07:13:07.985483 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874b7rqx_6fc0ef8b-404f-4614-9bb1-57550994c275/extract/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.191303 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv_2c3f03b0-9008-47a3-8fbe-7d4366757e02/util/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.349690 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv_2c3f03b0-9008-47a3-8fbe-7d4366757e02/util/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.356375 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv_2c3f03b0-9008-47a3-8fbe-7d4366757e02/pull/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.405362 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv_2c3f03b0-9008-47a3-8fbe-7d4366757e02/pull/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.582785 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv_2c3f03b0-9008-47a3-8fbe-7d4366757e02/pull/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.587350 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv_2c3f03b0-9008-47a3-8fbe-7d4366757e02/util/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.588008 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdffv_2c3f03b0-9008-47a3-8fbe-7d4366757e02/extract/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.776976 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww_aff57dd5-95fe-4005-bdaf-20d8516df1d4/util/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.955759 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww_aff57dd5-95fe-4005-bdaf-20d8516df1d4/pull/0.log" Mar 14 07:13:08 crc kubenswrapper[4713]: I0314 07:13:08.964463 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww_aff57dd5-95fe-4005-bdaf-20d8516df1d4/util/0.log" Mar 14 
07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.017610 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww_aff57dd5-95fe-4005-bdaf-20d8516df1d4/pull/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.179312 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww_aff57dd5-95fe-4005-bdaf-20d8516df1d4/extract/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.197406 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww_aff57dd5-95fe-4005-bdaf-20d8516df1d4/pull/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.203333 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d55bvww_aff57dd5-95fe-4005-bdaf-20d8516df1d4/util/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.356766 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4_6b454723-8a49-4be7-87fd-ca93753c8c91/util/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.585688 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4_6b454723-8a49-4be7-87fd-ca93753c8c91/pull/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.591283 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4_6b454723-8a49-4be7-87fd-ca93753c8c91/pull/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.606820 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4_6b454723-8a49-4be7-87fd-ca93753c8c91/util/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.808588 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4_6b454723-8a49-4be7-87fd-ca93753c8c91/util/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.817045 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4_6b454723-8a49-4be7-87fd-ca93753c8c91/extract/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.866075 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cqxhh4_6b454723-8a49-4be7-87fd-ca93753c8c91/pull/0.log" Mar 14 07:13:09 crc kubenswrapper[4713]: I0314 07:13:09.999270 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp_a917bf05-be84-43b9-ae72-d8b59822aeaf/util/0.log" Mar 14 07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.169099 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp_a917bf05-be84-43b9-ae72-d8b59822aeaf/pull/0.log" Mar 14 07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.170597 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp_a917bf05-be84-43b9-ae72-d8b59822aeaf/util/0.log" Mar 14 07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.189056 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp_a917bf05-be84-43b9-ae72-d8b59822aeaf/pull/0.log" Mar 14 
07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.412921 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp_a917bf05-be84-43b9-ae72-d8b59822aeaf/util/0.log" Mar 14 07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.414932 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp_a917bf05-be84-43b9-ae72-d8b59822aeaf/pull/0.log" Mar 14 07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.454609 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08crqdp_a917bf05-be84-43b9-ae72-d8b59822aeaf/extract/0.log" Mar 14 07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.610290 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-prgds_58f78f5a-d3da-4bf6-bf82-c98dbbe9602f/extract-utilities/0.log" Mar 14 07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.771173 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-prgds_58f78f5a-d3da-4bf6-bf82-c98dbbe9602f/extract-content/0.log" Mar 14 07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.784115 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-prgds_58f78f5a-d3da-4bf6-bf82-c98dbbe9602f/extract-content/0.log" Mar 14 07:13:10 crc kubenswrapper[4713]: I0314 07:13:10.808153 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-prgds_58f78f5a-d3da-4bf6-bf82-c98dbbe9602f/extract-utilities/0.log" Mar 14 07:13:11 crc kubenswrapper[4713]: I0314 07:13:11.071910 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-prgds_58f78f5a-d3da-4bf6-bf82-c98dbbe9602f/extract-utilities/0.log" Mar 14 07:13:11 crc 
kubenswrapper[4713]: I0314 07:13:11.094863 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-prgds_58f78f5a-d3da-4bf6-bf82-c98dbbe9602f/extract-content/0.log" Mar 14 07:13:11 crc kubenswrapper[4713]: I0314 07:13:11.384538 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vmvfw_e412202e-9dd7-4ebb-90a4-c25cbf3241b8/extract-utilities/0.log" Mar 14 07:13:11 crc kubenswrapper[4713]: I0314 07:13:11.582625 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vmvfw_e412202e-9dd7-4ebb-90a4-c25cbf3241b8/extract-utilities/0.log" Mar 14 07:13:11 crc kubenswrapper[4713]: I0314 07:13:11.599070 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vmvfw_e412202e-9dd7-4ebb-90a4-c25cbf3241b8/extract-content/0.log" Mar 14 07:13:11 crc kubenswrapper[4713]: I0314 07:13:11.673038 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vmvfw_e412202e-9dd7-4ebb-90a4-c25cbf3241b8/extract-content/0.log" Mar 14 07:13:11 crc kubenswrapper[4713]: I0314 07:13:11.898447 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vmvfw_e412202e-9dd7-4ebb-90a4-c25cbf3241b8/extract-utilities/0.log" Mar 14 07:13:11 crc kubenswrapper[4713]: I0314 07:13:11.901153 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vmvfw_e412202e-9dd7-4ebb-90a4-c25cbf3241b8/extract-content/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.156924 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bzczw_2c50b2f7-7be4-4125-94ac-525d908a9e86/marketplace-operator/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.184990 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-qhnqn_fc4d2c5d-cf64-489f-9229-3e79a6e369c3/extract-utilities/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.371385 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qhnqn_fc4d2c5d-cf64-489f-9229-3e79a6e369c3/extract-content/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.458591 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qhnqn_fc4d2c5d-cf64-489f-9229-3e79a6e369c3/extract-content/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.465706 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qhnqn_fc4d2c5d-cf64-489f-9229-3e79a6e369c3/extract-utilities/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.540490 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-prgds_58f78f5a-d3da-4bf6-bf82-c98dbbe9602f/registry-server/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.684557 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qhnqn_fc4d2c5d-cf64-489f-9229-3e79a6e369c3/extract-utilities/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.785320 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qhnqn_fc4d2c5d-cf64-489f-9229-3e79a6e369c3/extract-content/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.826475 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vmvfw_e412202e-9dd7-4ebb-90a4-c25cbf3241b8/registry-server/0.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.922584 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-qhnqn_fc4d2c5d-cf64-489f-9229-3e79a6e369c3/registry-server/1.log" Mar 14 07:13:12 crc kubenswrapper[4713]: I0314 07:13:12.965193 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qhnqn_fc4d2c5d-cf64-489f-9229-3e79a6e369c3/registry-server/0.log" Mar 14 07:13:13 crc kubenswrapper[4713]: I0314 07:13:13.053539 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-525cq_d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd/extract-utilities/0.log" Mar 14 07:13:13 crc kubenswrapper[4713]: I0314 07:13:13.239861 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-525cq_d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd/extract-content/0.log" Mar 14 07:13:13 crc kubenswrapper[4713]: I0314 07:13:13.298247 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-525cq_d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd/extract-content/0.log" Mar 14 07:13:13 crc kubenswrapper[4713]: I0314 07:13:13.303036 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-525cq_d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd/extract-utilities/0.log" Mar 14 07:13:13 crc kubenswrapper[4713]: I0314 07:13:13.449004 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-525cq_d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd/extract-utilities/0.log" Mar 14 07:13:13 crc kubenswrapper[4713]: I0314 07:13:13.525927 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-525cq_d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd/extract-content/0.log" Mar 14 07:13:13 crc kubenswrapper[4713]: I0314 07:13:13.858875 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-525cq_d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd/registry-server/1.log" Mar 14 
07:13:14 crc kubenswrapper[4713]: I0314 07:13:14.365945 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-525cq_d675a43d-ebc9-4ad8-92c6-89ecb59ea8fd/registry-server/0.log" Mar 14 07:13:26 crc kubenswrapper[4713]: I0314 07:13:26.447265 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-6f2sl_9bce1139-ffd0-4518-8f2c-7ef46b2892e4/prometheus-operator-admission-webhook/0.log" Mar 14 07:13:26 crc kubenswrapper[4713]: I0314 07:13:26.461737 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kdwsl_428d4860-9850-46cc-82c8-5fcf46c06748/prometheus-operator/0.log" Mar 14 07:13:26 crc kubenswrapper[4713]: I0314 07:13:26.545595 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7bb6d8b7c8-2xv4d_8f90d6ef-cc15-4d38-a2c5-f5e778500c73/prometheus-operator-admission-webhook/0.log" Mar 14 07:13:26 crc kubenswrapper[4713]: I0314 07:13:26.679862 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-hj46d_265f475a-c481-4f65-a176-b3d4c55c691d/observability-ui-dashboards/0.log" Mar 14 07:13:26 crc kubenswrapper[4713]: I0314 07:13:26.698784 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-wnccn_94841b22-b2eb-4519-b04f-98010d848b46/operator/0.log" Mar 14 07:13:26 crc kubenswrapper[4713]: I0314 07:13:26.762541 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-hqw7p_f9178880-ef43-43c5-8e91-f4c46d4aa0c6/perses-operator/0.log" Mar 14 07:13:41 crc kubenswrapper[4713]: I0314 07:13:41.158059 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-66945dfc9f-xqf5p_aec4bfff-0bef-401b-9db6-f9046825614a/manager/0.log" Mar 14 07:13:41 crc kubenswrapper[4713]: I0314 07:13:41.184357 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-66945dfc9f-xqf5p_aec4bfff-0bef-401b-9db6-f9046825614a/kube-rbac-proxy/0.log" Mar 14 07:13:52 crc kubenswrapper[4713]: E0314 07:13:52.991768 4713 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.106:44076->38.102.83.106:35023: write tcp 38.102.83.106:44076->38.102.83.106:35023: write: broken pipe Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.691464 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-486lp"] Mar 14 07:13:57 crc kubenswrapper[4713]: E0314 07:13:57.702783 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46cfb8e-338f-4281-84a1-59c3c42b7841" containerName="oc" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.702802 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46cfb8e-338f-4281-84a1-59c3c42b7841" containerName="oc" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.703079 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46cfb8e-338f-4281-84a1-59c3c42b7841" containerName="oc" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.704827 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-486lp"] Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.704916 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.862348 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-utilities\") pod \"certified-operators-486lp\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") " pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.862979 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-catalog-content\") pod \"certified-operators-486lp\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") " pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.863097 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-697nm\" (UniqueName: \"kubernetes.io/projected/4a272a43-cc57-42b2-9092-3d738fd9a797-kube-api-access-697nm\") pod \"certified-operators-486lp\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") " pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.965297 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-catalog-content\") pod \"certified-operators-486lp\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") " pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.965451 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-697nm\" (UniqueName: \"kubernetes.io/projected/4a272a43-cc57-42b2-9092-3d738fd9a797-kube-api-access-697nm\") pod 
\"certified-operators-486lp\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") " pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.965484 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-utilities\") pod \"certified-operators-486lp\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") " pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.967700 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-utilities\") pod \"certified-operators-486lp\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") " pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:57 crc kubenswrapper[4713]: I0314 07:13:57.968130 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-catalog-content\") pod \"certified-operators-486lp\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") " pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:58 crc kubenswrapper[4713]: I0314 07:13:58.012948 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-697nm\" (UniqueName: \"kubernetes.io/projected/4a272a43-cc57-42b2-9092-3d738fd9a797-kube-api-access-697nm\") pod \"certified-operators-486lp\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") " pod="openshift-marketplace/certified-operators-486lp" Mar 14 07:13:58 crc kubenswrapper[4713]: I0314 07:13:58.036419 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-486lp"
Mar 14 07:13:59 crc kubenswrapper[4713]: I0314 07:13:59.114434 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-486lp"]
Mar 14 07:13:59 crc kubenswrapper[4713]: I0314 07:13:59.735896 4713 generic.go:334] "Generic (PLEG): container finished" podID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerID="838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235" exitCode=0
Mar 14 07:13:59 crc kubenswrapper[4713]: I0314 07:13:59.736439 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-486lp" event={"ID":"4a272a43-cc57-42b2-9092-3d738fd9a797","Type":"ContainerDied","Data":"838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235"}
Mar 14 07:13:59 crc kubenswrapper[4713]: I0314 07:13:59.736489 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-486lp" event={"ID":"4a272a43-cc57-42b2-9092-3d738fd9a797","Type":"ContainerStarted","Data":"019c92a944b7e7b783005b76ee04a0d31ae983b96be4d93105a91d00b89a7331"}
Mar 14 07:13:59 crc kubenswrapper[4713]: I0314 07:13:59.750695 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.175513 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557874-28kfl"]
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.178320 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-28kfl"
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.187959 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.188181 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.188339 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs"
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.212248 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-28kfl"]
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.325029 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlkhq\" (UniqueName: \"kubernetes.io/projected/f7b394d3-02cf-407e-a196-7da3e7f759f8-kube-api-access-tlkhq\") pod \"auto-csr-approver-29557874-28kfl\" (UID: \"f7b394d3-02cf-407e-a196-7da3e7f759f8\") " pod="openshift-infra/auto-csr-approver-29557874-28kfl"
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.427749 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlkhq\" (UniqueName: \"kubernetes.io/projected/f7b394d3-02cf-407e-a196-7da3e7f759f8-kube-api-access-tlkhq\") pod \"auto-csr-approver-29557874-28kfl\" (UID: \"f7b394d3-02cf-407e-a196-7da3e7f759f8\") " pod="openshift-infra/auto-csr-approver-29557874-28kfl"
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.452123 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlkhq\" (UniqueName: \"kubernetes.io/projected/f7b394d3-02cf-407e-a196-7da3e7f759f8-kube-api-access-tlkhq\") pod \"auto-csr-approver-29557874-28kfl\" (UID: \"f7b394d3-02cf-407e-a196-7da3e7f759f8\") " pod="openshift-infra/auto-csr-approver-29557874-28kfl"
Mar 14 07:14:00 crc kubenswrapper[4713]: I0314 07:14:00.506356 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-28kfl"
Mar 14 07:14:01 crc kubenswrapper[4713]: I0314 07:14:01.213492 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-28kfl"]
Mar 14 07:14:01 crc kubenswrapper[4713]: I0314 07:14:01.810369 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-486lp" event={"ID":"4a272a43-cc57-42b2-9092-3d738fd9a797","Type":"ContainerStarted","Data":"5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241"}
Mar 14 07:14:01 crc kubenswrapper[4713]: I0314 07:14:01.812312 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-28kfl" event={"ID":"f7b394d3-02cf-407e-a196-7da3e7f759f8","Type":"ContainerStarted","Data":"d3d89df48c533a79b7d3d28aa259563daee0f6f9c43518bde144a64450e4f558"}
Mar 14 07:14:03 crc kubenswrapper[4713]: I0314 07:14:03.848583 4713 generic.go:334] "Generic (PLEG): container finished" podID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerID="5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241" exitCode=0
Mar 14 07:14:03 crc kubenswrapper[4713]: I0314 07:14:03.849912 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-486lp" event={"ID":"4a272a43-cc57-42b2-9092-3d738fd9a797","Type":"ContainerDied","Data":"5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241"}
Mar 14 07:14:03 crc kubenswrapper[4713]: I0314 07:14:03.863692 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-28kfl" event={"ID":"f7b394d3-02cf-407e-a196-7da3e7f759f8","Type":"ContainerStarted","Data":"7b365ef42cca0bff73add41f7d0b5647864bcbaff506f91c1c3d53c8ea5ee9a9"}
Mar 14 07:14:03 crc kubenswrapper[4713]: I0314 07:14:03.943397 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557874-28kfl" podStartSLOduration=3.18507722 podStartE2EDuration="3.94337764s" podCreationTimestamp="2026-03-14 07:14:00 +0000 UTC" firstStartedPulling="2026-03-14 07:14:01.213235106 +0000 UTC m=+6424.301144406" lastFinishedPulling="2026-03-14 07:14:01.971535526 +0000 UTC m=+6425.059444826" observedRunningTime="2026-03-14 07:14:03.919911626 +0000 UTC m=+6427.007820926" watchObservedRunningTime="2026-03-14 07:14:03.94337764 +0000 UTC m=+6427.031286940"
Mar 14 07:14:04 crc kubenswrapper[4713]: I0314 07:14:04.895340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-486lp" event={"ID":"4a272a43-cc57-42b2-9092-3d738fd9a797","Type":"ContainerStarted","Data":"885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4"}
Mar 14 07:14:04 crc kubenswrapper[4713]: I0314 07:14:04.930506 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-486lp" podStartSLOduration=3.2976091 podStartE2EDuration="7.930486798s" podCreationTimestamp="2026-03-14 07:13:57 +0000 UTC" firstStartedPulling="2026-03-14 07:13:59.746534294 +0000 UTC m=+6422.834443594" lastFinishedPulling="2026-03-14 07:14:04.379411992 +0000 UTC m=+6427.467321292" observedRunningTime="2026-03-14 07:14:04.930351174 +0000 UTC m=+6428.018260484" watchObservedRunningTime="2026-03-14 07:14:04.930486798 +0000 UTC m=+6428.018396098"
Mar 14 07:14:05 crc kubenswrapper[4713]: I0314 07:14:05.907647 4713 generic.go:334] "Generic (PLEG): container finished" podID="f7b394d3-02cf-407e-a196-7da3e7f759f8" containerID="7b365ef42cca0bff73add41f7d0b5647864bcbaff506f91c1c3d53c8ea5ee9a9" exitCode=0
Mar 14 07:14:05 crc kubenswrapper[4713]: I0314 07:14:05.907824 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-28kfl" event={"ID":"f7b394d3-02cf-407e-a196-7da3e7f759f8","Type":"ContainerDied","Data":"7b365ef42cca0bff73add41f7d0b5647864bcbaff506f91c1c3d53c8ea5ee9a9"}
Mar 14 07:14:07 crc kubenswrapper[4713]: I0314 07:14:07.630633 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-28kfl"
Mar 14 07:14:07 crc kubenswrapper[4713]: I0314 07:14:07.810839 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlkhq\" (UniqueName: \"kubernetes.io/projected/f7b394d3-02cf-407e-a196-7da3e7f759f8-kube-api-access-tlkhq\") pod \"f7b394d3-02cf-407e-a196-7da3e7f759f8\" (UID: \"f7b394d3-02cf-407e-a196-7da3e7f759f8\") "
Mar 14 07:14:07 crc kubenswrapper[4713]: I0314 07:14:07.841619 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b394d3-02cf-407e-a196-7da3e7f759f8-kube-api-access-tlkhq" (OuterVolumeSpecName: "kube-api-access-tlkhq") pod "f7b394d3-02cf-407e-a196-7da3e7f759f8" (UID: "f7b394d3-02cf-407e-a196-7da3e7f759f8"). InnerVolumeSpecName "kube-api-access-tlkhq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:14:07 crc kubenswrapper[4713]: I0314 07:14:07.918022 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlkhq\" (UniqueName: \"kubernetes.io/projected/f7b394d3-02cf-407e-a196-7da3e7f759f8-kube-api-access-tlkhq\") on node \"crc\" DevicePath \"\""
Mar 14 07:14:07 crc kubenswrapper[4713]: I0314 07:14:07.929908 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-28kfl" event={"ID":"f7b394d3-02cf-407e-a196-7da3e7f759f8","Type":"ContainerDied","Data":"d3d89df48c533a79b7d3d28aa259563daee0f6f9c43518bde144a64450e4f558"}
Mar 14 07:14:07 crc kubenswrapper[4713]: I0314 07:14:07.929953 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3d89df48c533a79b7d3d28aa259563daee0f6f9c43518bde144a64450e4f558"
Mar 14 07:14:07 crc kubenswrapper[4713]: I0314 07:14:07.929984 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-28kfl"
Mar 14 07:14:08 crc kubenswrapper[4713]: I0314 07:14:08.039367 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-486lp"
Mar 14 07:14:08 crc kubenswrapper[4713]: I0314 07:14:08.039425 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-486lp"
Mar 14 07:14:08 crc kubenswrapper[4713]: I0314 07:14:08.713900 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-g2hff"]
Mar 14 07:14:08 crc kubenswrapper[4713]: I0314 07:14:08.742543 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-g2hff"]
Mar 14 07:14:09 crc kubenswrapper[4713]: I0314 07:14:09.100480 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-486lp" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="registry-server" probeResult="failure" output=<
Mar 14 07:14:09 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 07:14:09 crc kubenswrapper[4713]: >
Mar 14 07:14:09 crc kubenswrapper[4713]: I0314 07:14:09.581999 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3d08e70-3946-4a9a-9c14-1078698c2383" path="/var/lib/kubelet/pods/c3d08e70-3946-4a9a-9c14-1078698c2383/volumes"
Mar 14 07:14:10 crc kubenswrapper[4713]: E0314 07:14:10.425062 4713 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.106:45732->38.102.83.106:35023: write tcp 38.102.83.106:45732->38.102.83.106:35023: write: broken pipe
Mar 14 07:14:19 crc kubenswrapper[4713]: I0314 07:14:19.095200 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-486lp" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="registry-server" probeResult="failure" output=<
Mar 14 07:14:19 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Mar 14 07:14:19 crc kubenswrapper[4713]: >
Mar 14 07:14:28 crc kubenswrapper[4713]: I0314 07:14:28.163135 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-486lp"
Mar 14 07:14:28 crc kubenswrapper[4713]: I0314 07:14:28.225998 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-486lp"
Mar 14 07:14:28 crc kubenswrapper[4713]: I0314 07:14:28.757460 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-486lp"]
Mar 14 07:14:30 crc kubenswrapper[4713]: I0314 07:14:30.159542 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-486lp" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="registry-server" containerID="cri-o://885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4" gracePeriod=2
Mar 14 07:14:30 crc kubenswrapper[4713]: I0314 07:14:30.857635 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-486lp"
Mar 14 07:14:30 crc kubenswrapper[4713]: I0314 07:14:30.958016 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-catalog-content\") pod \"4a272a43-cc57-42b2-9092-3d738fd9a797\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") "
Mar 14 07:14:30 crc kubenswrapper[4713]: I0314 07:14:30.958506 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-utilities\") pod \"4a272a43-cc57-42b2-9092-3d738fd9a797\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") "
Mar 14 07:14:30 crc kubenswrapper[4713]: I0314 07:14:30.958846 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-697nm\" (UniqueName: \"kubernetes.io/projected/4a272a43-cc57-42b2-9092-3d738fd9a797-kube-api-access-697nm\") pod \"4a272a43-cc57-42b2-9092-3d738fd9a797\" (UID: \"4a272a43-cc57-42b2-9092-3d738fd9a797\") "
Mar 14 07:14:30 crc kubenswrapper[4713]: I0314 07:14:30.959349 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-utilities" (OuterVolumeSpecName: "utilities") pod "4a272a43-cc57-42b2-9092-3d738fd9a797" (UID: "4a272a43-cc57-42b2-9092-3d738fd9a797"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:14:30 crc kubenswrapper[4713]: I0314 07:14:30.960603 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 07:14:30 crc kubenswrapper[4713]: I0314 07:14:30.967976 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a272a43-cc57-42b2-9092-3d738fd9a797-kube-api-access-697nm" (OuterVolumeSpecName: "kube-api-access-697nm") pod "4a272a43-cc57-42b2-9092-3d738fd9a797" (UID: "4a272a43-cc57-42b2-9092-3d738fd9a797"). InnerVolumeSpecName "kube-api-access-697nm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.013417 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a272a43-cc57-42b2-9092-3d738fd9a797" (UID: "4a272a43-cc57-42b2-9092-3d738fd9a797"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.063514 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-697nm\" (UniqueName: \"kubernetes.io/projected/4a272a43-cc57-42b2-9092-3d738fd9a797-kube-api-access-697nm\") on node \"crc\" DevicePath \"\""
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.063555 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a272a43-cc57-42b2-9092-3d738fd9a797-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.173627 4713 generic.go:334] "Generic (PLEG): container finished" podID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerID="885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4" exitCode=0
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.173674 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-486lp" event={"ID":"4a272a43-cc57-42b2-9092-3d738fd9a797","Type":"ContainerDied","Data":"885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4"}
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.173703 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-486lp" event={"ID":"4a272a43-cc57-42b2-9092-3d738fd9a797","Type":"ContainerDied","Data":"019c92a944b7e7b783005b76ee04a0d31ae983b96be4d93105a91d00b89a7331"}
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.173724 4713 scope.go:117] "RemoveContainer" containerID="885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.174861 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-486lp"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.202398 4713 scope.go:117] "RemoveContainer" containerID="5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.218488 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-486lp"]
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.229629 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-486lp"]
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.239517 4713 scope.go:117] "RemoveContainer" containerID="838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.295401 4713 scope.go:117] "RemoveContainer" containerID="885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4"
Mar 14 07:14:31 crc kubenswrapper[4713]: E0314 07:14:31.295989 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4\": container with ID starting with 885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4 not found: ID does not exist" containerID="885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.296138 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4"} err="failed to get container status \"885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4\": rpc error: code = NotFound desc = could not find container \"885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4\": container with ID starting with 885d8afc2cf86c28ee437a44ce3c1febb4dbec33a733ee63bfd6197bf29a20d4 not found: ID does not exist"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.296345 4713 scope.go:117] "RemoveContainer" containerID="5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241"
Mar 14 07:14:31 crc kubenswrapper[4713]: E0314 07:14:31.296807 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241\": container with ID starting with 5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241 not found: ID does not exist" containerID="5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.296957 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241"} err="failed to get container status \"5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241\": rpc error: code = NotFound desc = could not find container \"5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241\": container with ID starting with 5267155f97cac9385830ee9c1eac81519ee9b0997a76808199c689b6d499a241 not found: ID does not exist"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.297076 4713 scope.go:117] "RemoveContainer" containerID="838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235"
Mar 14 07:14:31 crc kubenswrapper[4713]: E0314 07:14:31.298576 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235\": container with ID starting with 838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235 not found: ID does not exist" containerID="838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.298633 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235"} err="failed to get container status \"838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235\": rpc error: code = NotFound desc = could not find container \"838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235\": container with ID starting with 838b7f2ed7f500017ad1d2d5fd3381379ba3c38e11fee08fb21f538c64a3a235 not found: ID does not exist"
Mar 14 07:14:31 crc kubenswrapper[4713]: I0314 07:14:31.584754 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" path="/var/lib/kubelet/pods/4a272a43-cc57-42b2-9092-3d738fd9a797/volumes"
Mar 14 07:14:59 crc kubenswrapper[4713]: I0314 07:14:59.150680 4713 scope.go:117] "RemoveContainer" containerID="defd450a42c66f6869b13ecc59759cd924433325ebb78b7a4edc28f026a44ce9"
Mar 14 07:14:59 crc kubenswrapper[4713]: I0314 07:14:59.187292 4713 scope.go:117] "RemoveContainer" containerID="fee625f7ef359ca7f1c872067695ff965ad4ab9c615e972fc59d390d642d1d5c"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.167695 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"]
Mar 14 07:15:00 crc kubenswrapper[4713]: E0314 07:15:00.168544 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="registry-server"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.168926 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="registry-server"
Mar 14 07:15:00 crc kubenswrapper[4713]: E0314 07:15:00.168957 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b394d3-02cf-407e-a196-7da3e7f759f8" containerName="oc"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.168971 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b394d3-02cf-407e-a196-7da3e7f759f8" containerName="oc"
Mar 14 07:15:00 crc kubenswrapper[4713]: E0314 07:15:00.168994 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="extract-utilities"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.169007 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="extract-utilities"
Mar 14 07:15:00 crc kubenswrapper[4713]: E0314 07:15:00.169022 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="extract-content"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.169029 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="extract-content"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.169399 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a272a43-cc57-42b2-9092-3d738fd9a797" containerName="registry-server"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.169429 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b394d3-02cf-407e-a196-7da3e7f759f8" containerName="oc"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.170659 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.174027 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.174642 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.184475 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"]
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.320893 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdskt\" (UniqueName: \"kubernetes.io/projected/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-kube-api-access-fdskt\") pod \"collect-profiles-29557875-fzdxn\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.321043 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-config-volume\") pod \"collect-profiles-29557875-fzdxn\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.321459 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-secret-volume\") pod \"collect-profiles-29557875-fzdxn\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.424857 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-secret-volume\") pod \"collect-profiles-29557875-fzdxn\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.425094 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdskt\" (UniqueName: \"kubernetes.io/projected/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-kube-api-access-fdskt\") pod \"collect-profiles-29557875-fzdxn\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.425159 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-config-volume\") pod \"collect-profiles-29557875-fzdxn\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.426377 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-config-volume\") pod \"collect-profiles-29557875-fzdxn\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.433663 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-secret-volume\") pod \"collect-profiles-29557875-fzdxn\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.443637 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdskt\" (UniqueName: \"kubernetes.io/projected/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-kube-api-access-fdskt\") pod \"collect-profiles-29557875-fzdxn\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.501698 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.856916 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ckq9j"]
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.861582 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:00 crc kubenswrapper[4713]: I0314 07:15:00.888441 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ckq9j"]
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.042579 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79mxv\" (UniqueName: \"kubernetes.io/projected/c8dc2c05-009c-4fa4-91fa-fec417a71964-kube-api-access-79mxv\") pod \"community-operators-ckq9j\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.042642 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-utilities\") pod \"community-operators-ckq9j\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.043236 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-catalog-content\") pod \"community-operators-ckq9j\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.118117 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"]
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.144970 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79mxv\" (UniqueName: \"kubernetes.io/projected/c8dc2c05-009c-4fa4-91fa-fec417a71964-kube-api-access-79mxv\") pod \"community-operators-ckq9j\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.145354 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-utilities\") pod \"community-operators-ckq9j\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.145493 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-catalog-content\") pod \"community-operators-ckq9j\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.145910 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-utilities\") pod \"community-operators-ckq9j\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.145990 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-catalog-content\") pod \"community-operators-ckq9j\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.170100 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79mxv\" (UniqueName: \"kubernetes.io/projected/c8dc2c05-009c-4fa4-91fa-fec417a71964-kube-api-access-79mxv\") pod \"community-operators-ckq9j\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.216506 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ckq9j"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.750171 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn" event={"ID":"bc6c5ccc-6d3a-4851-841d-361a491a0a7e","Type":"ContainerStarted","Data":"2034acf1cc5cb4ee189b221a192b55ec93fb66f2a28f7ebc9e9319a1b403ef27"}
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.750663 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn" event={"ID":"bc6c5ccc-6d3a-4851-841d-361a491a0a7e","Type":"ContainerStarted","Data":"1a97c0df4983ff724eff13eeacf96ef1c4ac8174f0de37c09a5e9a26910bcabd"}
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.821054 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn" podStartSLOduration=1.8210324500000001 podStartE2EDuration="1.82103245s" podCreationTimestamp="2026-03-14 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:15:01.784263731 +0000 UTC m=+6484.872173041" watchObservedRunningTime="2026-03-14 07:15:01.82103245 +0000 UTC m=+6484.908941740"
Mar 14 07:15:01 crc kubenswrapper[4713]: I0314 07:15:01.881851 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ckq9j"]
Mar 14 07:15:02 crc kubenswrapper[4713]: I0314 07:15:02.763267 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerID="26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca" exitCode=0
Mar 14 07:15:02 crc kubenswrapper[4713]: I0314 07:15:02.763409 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckq9j" event={"ID":"c8dc2c05-009c-4fa4-91fa-fec417a71964","Type":"ContainerDied","Data":"26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca"}
Mar 14 07:15:02 crc kubenswrapper[4713]: I0314 07:15:02.763917 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckq9j" event={"ID":"c8dc2c05-009c-4fa4-91fa-fec417a71964","Type":"ContainerStarted","Data":"134252402b48a7f333768e3e3eea0e429c02af9b66c0d1c6dad7a60a515e2250"}
Mar 14 07:15:02 crc kubenswrapper[4713]: I0314 07:15:02.767068 4713 generic.go:334] "Generic (PLEG): container finished" podID="bc6c5ccc-6d3a-4851-841d-361a491a0a7e" containerID="2034acf1cc5cb4ee189b221a192b55ec93fb66f2a28f7ebc9e9319a1b403ef27" exitCode=0
Mar 14 07:15:02 crc kubenswrapper[4713]: I0314 07:15:02.767117 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn" event={"ID":"bc6c5ccc-6d3a-4851-841d-361a491a0a7e","Type":"ContainerDied","Data":"2034acf1cc5cb4ee189b221a192b55ec93fb66f2a28f7ebc9e9319a1b403ef27"}
Mar 14 07:15:03 crc kubenswrapper[4713]: I0314 07:15:03.779914 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckq9j" event={"ID":"c8dc2c05-009c-4fa4-91fa-fec417a71964","Type":"ContainerStarted","Data":"030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719"}
Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.290569 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn"
Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.461381 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-secret-volume\") pod \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") "
Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.461460 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-config-volume\") pod \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") "
Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.462225 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc6c5ccc-6d3a-4851-841d-361a491a0a7e" (UID: "bc6c5ccc-6d3a-4851-841d-361a491a0a7e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.462333 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdskt\" (UniqueName: \"kubernetes.io/projected/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-kube-api-access-fdskt\") pod \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\" (UID: \"bc6c5ccc-6d3a-4851-841d-361a491a0a7e\") " Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.464416 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.470571 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-kube-api-access-fdskt" (OuterVolumeSpecName: "kube-api-access-fdskt") pod "bc6c5ccc-6d3a-4851-841d-361a491a0a7e" (UID: "bc6c5ccc-6d3a-4851-841d-361a491a0a7e"). InnerVolumeSpecName "kube-api-access-fdskt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.475162 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bc6c5ccc-6d3a-4851-841d-361a491a0a7e" (UID: "bc6c5ccc-6d3a-4851-841d-361a491a0a7e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.590659 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.591241 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdskt\" (UniqueName: \"kubernetes.io/projected/bc6c5ccc-6d3a-4851-841d-361a491a0a7e-kube-api-access-fdskt\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.800706 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn" event={"ID":"bc6c5ccc-6d3a-4851-841d-361a491a0a7e","Type":"ContainerDied","Data":"1a97c0df4983ff724eff13eeacf96ef1c4ac8174f0de37c09a5e9a26910bcabd"} Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.802001 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a97c0df4983ff724eff13eeacf96ef1c4ac8174f0de37c09a5e9a26910bcabd" Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.801925 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-fzdxn" Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.908742 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt"] Mar 14 07:15:04 crc kubenswrapper[4713]: I0314 07:15:04.925092 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557830-8flzt"] Mar 14 07:15:05 crc kubenswrapper[4713]: I0314 07:15:05.577033 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb531f8c-c460-4477-a1be-26640a8cea40" path="/var/lib/kubelet/pods/bb531f8c-c460-4477-a1be-26640a8cea40/volumes" Mar 14 07:15:07 crc kubenswrapper[4713]: I0314 07:15:07.837758 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerID="030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719" exitCode=0 Mar 14 07:15:07 crc kubenswrapper[4713]: I0314 07:15:07.837816 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckq9j" event={"ID":"c8dc2c05-009c-4fa4-91fa-fec417a71964","Type":"ContainerDied","Data":"030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719"} Mar 14 07:15:08 crc kubenswrapper[4713]: I0314 07:15:08.850777 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckq9j" event={"ID":"c8dc2c05-009c-4fa4-91fa-fec417a71964","Type":"ContainerStarted","Data":"4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69"} Mar 14 07:15:08 crc kubenswrapper[4713]: I0314 07:15:08.872087 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ckq9j" podStartSLOduration=3.2582316000000002 podStartE2EDuration="8.872065794s" podCreationTimestamp="2026-03-14 07:15:00 +0000 UTC" firstStartedPulling="2026-03-14 
07:15:02.766833598 +0000 UTC m=+6485.854742898" lastFinishedPulling="2026-03-14 07:15:08.380667792 +0000 UTC m=+6491.468577092" observedRunningTime="2026-03-14 07:15:08.866885072 +0000 UTC m=+6491.954794372" watchObservedRunningTime="2026-03-14 07:15:08.872065794 +0000 UTC m=+6491.959975094" Mar 14 07:15:10 crc kubenswrapper[4713]: I0314 07:15:10.731898 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:15:10 crc kubenswrapper[4713]: I0314 07:15:10.732363 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:15:11 crc kubenswrapper[4713]: I0314 07:15:11.217033 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ckq9j" Mar 14 07:15:11 crc kubenswrapper[4713]: I0314 07:15:11.217087 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ckq9j" Mar 14 07:15:12 crc kubenswrapper[4713]: I0314 07:15:12.317497 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ckq9j" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="registry-server" probeResult="failure" output=< Mar 14 07:15:12 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:15:12 crc kubenswrapper[4713]: > Mar 14 07:15:22 crc kubenswrapper[4713]: I0314 07:15:22.272355 4713 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/community-operators-ckq9j" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="registry-server" probeResult="failure" output=< Mar 14 07:15:22 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:15:22 crc kubenswrapper[4713]: > Mar 14 07:15:31 crc kubenswrapper[4713]: I0314 07:15:31.273513 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ckq9j" Mar 14 07:15:31 crc kubenswrapper[4713]: I0314 07:15:31.334917 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ckq9j" Mar 14 07:15:32 crc kubenswrapper[4713]: I0314 07:15:32.031650 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ckq9j"] Mar 14 07:15:33 crc kubenswrapper[4713]: I0314 07:15:33.140755 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ckq9j" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="registry-server" containerID="cri-o://4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69" gracePeriod=2 Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.133286 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ckq9j" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.164946 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerID="4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69" exitCode=0 Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.164988 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckq9j" event={"ID":"c8dc2c05-009c-4fa4-91fa-fec417a71964","Type":"ContainerDied","Data":"4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69"} Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.165020 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ckq9j" event={"ID":"c8dc2c05-009c-4fa4-91fa-fec417a71964","Type":"ContainerDied","Data":"134252402b48a7f333768e3e3eea0e429c02af9b66c0d1c6dad7a60a515e2250"} Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.165023 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ckq9j" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.165035 4713 scope.go:117] "RemoveContainer" containerID="4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.188340 4713 scope.go:117] "RemoveContainer" containerID="030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.218357 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-utilities\") pod \"c8dc2c05-009c-4fa4-91fa-fec417a71964\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.218507 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79mxv\" (UniqueName: \"kubernetes.io/projected/c8dc2c05-009c-4fa4-91fa-fec417a71964-kube-api-access-79mxv\") pod \"c8dc2c05-009c-4fa4-91fa-fec417a71964\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.218536 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-catalog-content\") pod \"c8dc2c05-009c-4fa4-91fa-fec417a71964\" (UID: \"c8dc2c05-009c-4fa4-91fa-fec417a71964\") " Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.219166 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-utilities" (OuterVolumeSpecName: "utilities") pod "c8dc2c05-009c-4fa4-91fa-fec417a71964" (UID: "c8dc2c05-009c-4fa4-91fa-fec417a71964"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.222686 4713 scope.go:117] "RemoveContainer" containerID="26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.234534 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8dc2c05-009c-4fa4-91fa-fec417a71964-kube-api-access-79mxv" (OuterVolumeSpecName: "kube-api-access-79mxv") pod "c8dc2c05-009c-4fa4-91fa-fec417a71964" (UID: "c8dc2c05-009c-4fa4-91fa-fec417a71964"). InnerVolumeSpecName "kube-api-access-79mxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.285490 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8dc2c05-009c-4fa4-91fa-fec417a71964" (UID: "c8dc2c05-009c-4fa4-91fa-fec417a71964"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.321832 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.321860 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79mxv\" (UniqueName: \"kubernetes.io/projected/c8dc2c05-009c-4fa4-91fa-fec417a71964-kube-api-access-79mxv\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.321870 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8dc2c05-009c-4fa4-91fa-fec417a71964-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.356385 4713 scope.go:117] "RemoveContainer" containerID="4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69" Mar 14 07:15:35 crc kubenswrapper[4713]: E0314 07:15:34.356991 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69\": container with ID starting with 4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69 not found: ID does not exist" containerID="4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.357023 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69"} err="failed to get container status \"4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69\": rpc error: code = NotFound desc = could not find container \"4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69\": container with ID 
starting with 4b259e6c8a8060d950bc7763d4f8c45908bdbe94c75146c769ef44cd849d5e69 not found: ID does not exist" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.357043 4713 scope.go:117] "RemoveContainer" containerID="030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719" Mar 14 07:15:35 crc kubenswrapper[4713]: E0314 07:15:34.357486 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719\": container with ID starting with 030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719 not found: ID does not exist" containerID="030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.357511 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719"} err="failed to get container status \"030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719\": rpc error: code = NotFound desc = could not find container \"030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719\": container with ID starting with 030fdc920466bba2d62bcac99c53fa830524d472b69560b8660b78e8b49e6719 not found: ID does not exist" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.357528 4713 scope.go:117] "RemoveContainer" containerID="26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca" Mar 14 07:15:35 crc kubenswrapper[4713]: E0314 07:15:34.357875 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca\": container with ID starting with 26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca not found: ID does not exist" containerID="26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca" Mar 14 
07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.357893 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca"} err="failed to get container status \"26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca\": rpc error: code = NotFound desc = could not find container \"26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca\": container with ID starting with 26deafa260231b8ecefa9836e65d4aac28dce415c5812a8bdcd022c805e785ca not found: ID does not exist" Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.513867 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ckq9j"] Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:34.543491 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ckq9j"] Mar 14 07:15:35 crc kubenswrapper[4713]: I0314 07:15:35.580399 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" path="/var/lib/kubelet/pods/c8dc2c05-009c-4fa4-91fa-fec417a71964/volumes" Mar 14 07:15:40 crc kubenswrapper[4713]: I0314 07:15:40.731925 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:15:40 crc kubenswrapper[4713]: I0314 07:15:40.732482 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:15:57 crc kubenswrapper[4713]: I0314 07:15:57.541235 4713 
generic.go:334] "Generic (PLEG): container finished" podID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" containerID="2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3" exitCode=0 Mar 14 07:15:57 crc kubenswrapper[4713]: I0314 07:15:57.541340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfn2/must-gather-shpdg" event={"ID":"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b","Type":"ContainerDied","Data":"2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3"} Mar 14 07:15:57 crc kubenswrapper[4713]: I0314 07:15:57.543589 4713 scope.go:117] "RemoveContainer" containerID="2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3" Mar 14 07:15:57 crc kubenswrapper[4713]: I0314 07:15:57.797156 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8mfn2_must-gather-shpdg_4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b/gather/0.log" Mar 14 07:15:59 crc kubenswrapper[4713]: I0314 07:15:59.416452 4713 scope.go:117] "RemoveContainer" containerID="a7f79758e81a6492f66974d8884a1505d3baf6d94f72f7e20fbe74e0a1395a15" Mar 14 07:15:59 crc kubenswrapper[4713]: I0314 07:15:59.460223 4713 scope.go:117] "RemoveContainer" containerID="1a766e8cf4550dbd5ecdfaae3e67f4d0255371491a6684ec1acc665a8b9ea767" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.175246 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557876-hs767"] Mar 14 07:16:00 crc kubenswrapper[4713]: E0314 07:16:00.178159 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6c5ccc-6d3a-4851-841d-361a491a0a7e" containerName="collect-profiles" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.178956 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6c5ccc-6d3a-4851-841d-361a491a0a7e" containerName="collect-profiles" Mar 14 07:16:00 crc kubenswrapper[4713]: E0314 07:16:00.179024 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="extract-content" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.179032 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="extract-content" Mar 14 07:16:00 crc kubenswrapper[4713]: E0314 07:16:00.179095 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="extract-utilities" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.179106 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="extract-utilities" Mar 14 07:16:00 crc kubenswrapper[4713]: E0314 07:16:00.179145 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="registry-server" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.179154 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="registry-server" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.184185 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8dc2c05-009c-4fa4-91fa-fec417a71964" containerName="registry-server" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.184277 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc6c5ccc-6d3a-4851-841d-361a491a0a7e" containerName="collect-profiles" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.186882 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-hs767" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.191111 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.191427 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.191466 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.195873 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-hs767"] Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.202271 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zzsd\" (UniqueName: \"kubernetes.io/projected/741e77d6-23b6-4f92-a568-15de57b9b700-kube-api-access-5zzsd\") pod \"auto-csr-approver-29557876-hs767\" (UID: \"741e77d6-23b6-4f92-a568-15de57b9b700\") " pod="openshift-infra/auto-csr-approver-29557876-hs767" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.306830 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zzsd\" (UniqueName: \"kubernetes.io/projected/741e77d6-23b6-4f92-a568-15de57b9b700-kube-api-access-5zzsd\") pod \"auto-csr-approver-29557876-hs767\" (UID: \"741e77d6-23b6-4f92-a568-15de57b9b700\") " pod="openshift-infra/auto-csr-approver-29557876-hs767" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.336687 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zzsd\" (UniqueName: \"kubernetes.io/projected/741e77d6-23b6-4f92-a568-15de57b9b700-kube-api-access-5zzsd\") pod \"auto-csr-approver-29557876-hs767\" (UID: \"741e77d6-23b6-4f92-a568-15de57b9b700\") " 
pod="openshift-infra/auto-csr-approver-29557876-hs767" Mar 14 07:16:00 crc kubenswrapper[4713]: I0314 07:16:00.515571 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-hs767" Mar 14 07:16:01 crc kubenswrapper[4713]: I0314 07:16:01.081278 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-hs767"] Mar 14 07:16:01 crc kubenswrapper[4713]: W0314 07:16:01.092773 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod741e77d6_23b6_4f92_a568_15de57b9b700.slice/crio-a54d24ab4f0702fc64d135965e37eaa64445a2cafaaac23523a8abbfab918c45 WatchSource:0}: Error finding container a54d24ab4f0702fc64d135965e37eaa64445a2cafaaac23523a8abbfab918c45: Status 404 returned error can't find the container with id a54d24ab4f0702fc64d135965e37eaa64445a2cafaaac23523a8abbfab918c45 Mar 14 07:16:01 crc kubenswrapper[4713]: I0314 07:16:01.635716 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-hs767" event={"ID":"741e77d6-23b6-4f92-a568-15de57b9b700","Type":"ContainerStarted","Data":"a54d24ab4f0702fc64d135965e37eaa64445a2cafaaac23523a8abbfab918c45"} Mar 14 07:16:02 crc kubenswrapper[4713]: I0314 07:16:02.652516 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-hs767" event={"ID":"741e77d6-23b6-4f92-a568-15de57b9b700","Type":"ContainerStarted","Data":"440e593f645e80827f501ed1f8ab7ae12a2a988fd15ffd3c1d19fd53aa90c11f"} Mar 14 07:16:02 crc kubenswrapper[4713]: I0314 07:16:02.684148 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557876-hs767" podStartSLOduration=1.86120339 podStartE2EDuration="2.684120279s" podCreationTimestamp="2026-03-14 07:16:00 +0000 UTC" firstStartedPulling="2026-03-14 07:16:01.095839399 +0000 UTC 
m=+6544.183748699" lastFinishedPulling="2026-03-14 07:16:01.918756288 +0000 UTC m=+6545.006665588" observedRunningTime="2026-03-14 07:16:02.669928746 +0000 UTC m=+6545.757838046" watchObservedRunningTime="2026-03-14 07:16:02.684120279 +0000 UTC m=+6545.772029579" Mar 14 07:16:04 crc kubenswrapper[4713]: I0314 07:16:04.690268 4713 generic.go:334] "Generic (PLEG): container finished" podID="741e77d6-23b6-4f92-a568-15de57b9b700" containerID="440e593f645e80827f501ed1f8ab7ae12a2a988fd15ffd3c1d19fd53aa90c11f" exitCode=0 Mar 14 07:16:04 crc kubenswrapper[4713]: I0314 07:16:04.690357 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-hs767" event={"ID":"741e77d6-23b6-4f92-a568-15de57b9b700","Type":"ContainerDied","Data":"440e593f645e80827f501ed1f8ab7ae12a2a988fd15ffd3c1d19fd53aa90c11f"} Mar 14 07:16:06 crc kubenswrapper[4713]: I0314 07:16:06.554159 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-hs767" Mar 14 07:16:06 crc kubenswrapper[4713]: I0314 07:16:06.678474 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zzsd\" (UniqueName: \"kubernetes.io/projected/741e77d6-23b6-4f92-a568-15de57b9b700-kube-api-access-5zzsd\") pod \"741e77d6-23b6-4f92-a568-15de57b9b700\" (UID: \"741e77d6-23b6-4f92-a568-15de57b9b700\") " Mar 14 07:16:06 crc kubenswrapper[4713]: I0314 07:16:06.695867 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741e77d6-23b6-4f92-a568-15de57b9b700-kube-api-access-5zzsd" (OuterVolumeSpecName: "kube-api-access-5zzsd") pod "741e77d6-23b6-4f92-a568-15de57b9b700" (UID: "741e77d6-23b6-4f92-a568-15de57b9b700"). InnerVolumeSpecName "kube-api-access-5zzsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:16:06 crc kubenswrapper[4713]: I0314 07:16:06.713789 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-hs767" event={"ID":"741e77d6-23b6-4f92-a568-15de57b9b700","Type":"ContainerDied","Data":"a54d24ab4f0702fc64d135965e37eaa64445a2cafaaac23523a8abbfab918c45"} Mar 14 07:16:06 crc kubenswrapper[4713]: I0314 07:16:06.713844 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a54d24ab4f0702fc64d135965e37eaa64445a2cafaaac23523a8abbfab918c45" Mar 14 07:16:06 crc kubenswrapper[4713]: I0314 07:16:06.713918 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-hs767" Mar 14 07:16:06 crc kubenswrapper[4713]: I0314 07:16:06.774110 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-9jzfw"] Mar 14 07:16:06 crc kubenswrapper[4713]: I0314 07:16:06.782079 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zzsd\" (UniqueName: \"kubernetes.io/projected/741e77d6-23b6-4f92-a568-15de57b9b700-kube-api-access-5zzsd\") on node \"crc\" DevicePath \"\"" Mar 14 07:16:06 crc kubenswrapper[4713]: I0314 07:16:06.788680 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-9jzfw"] Mar 14 07:16:07 crc kubenswrapper[4713]: I0314 07:16:07.580337 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3503770-8264-4526-9916-2fe62e985b1a" path="/var/lib/kubelet/pods/c3503770-8264-4526-9916-2fe62e985b1a/volumes" Mar 14 07:16:07 crc kubenswrapper[4713]: I0314 07:16:07.738007 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8mfn2/must-gather-shpdg"] Mar 14 07:16:07 crc kubenswrapper[4713]: I0314 07:16:07.738393 4713 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-must-gather-8mfn2/must-gather-shpdg" podUID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" containerName="copy" containerID="cri-o://10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335" gracePeriod=2 Mar 14 07:16:07 crc kubenswrapper[4713]: I0314 07:16:07.752529 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8mfn2/must-gather-shpdg"] Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.285686 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8mfn2_must-gather-shpdg_4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b/copy/0.log" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.286663 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.438097 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-must-gather-output\") pod \"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\" (UID: \"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\") " Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.438257 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjpgm\" (UniqueName: \"kubernetes.io/projected/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-kube-api-access-xjpgm\") pod \"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\" (UID: \"4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b\") " Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.447620 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-kube-api-access-xjpgm" (OuterVolumeSpecName: "kube-api-access-xjpgm") pod "4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" (UID: "4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b"). InnerVolumeSpecName "kube-api-access-xjpgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.543489 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjpgm\" (UniqueName: \"kubernetes.io/projected/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-kube-api-access-xjpgm\") on node \"crc\" DevicePath \"\"" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.581378 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" (UID: "4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.645985 4713 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.745270 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8mfn2_must-gather-shpdg_4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b/copy/0.log" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.748141 4713 generic.go:334] "Generic (PLEG): container finished" podID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" containerID="10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335" exitCode=143 Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.748244 4713 scope.go:117] "RemoveContainer" containerID="10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.748423 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8mfn2/must-gather-shpdg" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.781571 4713 scope.go:117] "RemoveContainer" containerID="2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.867710 4713 scope.go:117] "RemoveContainer" containerID="10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335" Mar 14 07:16:08 crc kubenswrapper[4713]: E0314 07:16:08.869267 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335\": container with ID starting with 10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335 not found: ID does not exist" containerID="10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.869314 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335"} err="failed to get container status \"10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335\": rpc error: code = NotFound desc = could not find container \"10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335\": container with ID starting with 10beaa102ed0b04d2f46ca5e0a93f0a62b039ae1aaeddcdee744aa7602e52335 not found: ID does not exist" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.869341 4713 scope.go:117] "RemoveContainer" containerID="2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3" Mar 14 07:16:08 crc kubenswrapper[4713]: E0314 07:16:08.870311 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3\": container with ID starting with 
2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3 not found: ID does not exist" containerID="2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3" Mar 14 07:16:08 crc kubenswrapper[4713]: I0314 07:16:08.870336 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3"} err="failed to get container status \"2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3\": rpc error: code = NotFound desc = could not find container \"2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3\": container with ID starting with 2f384b31b5c5e8f64ca112824922aa12f51e71db2b4b2e9b478ef55770e8b8e3 not found: ID does not exist" Mar 14 07:16:09 crc kubenswrapper[4713]: I0314 07:16:09.576739 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" path="/var/lib/kubelet/pods/4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b/volumes" Mar 14 07:16:10 crc kubenswrapper[4713]: I0314 07:16:10.731532 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:16:10 crc kubenswrapper[4713]: I0314 07:16:10.731971 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:16:10 crc kubenswrapper[4713]: I0314 07:16:10.732021 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 07:16:10 crc 
kubenswrapper[4713]: I0314 07:16:10.733080 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20996e6149458437a19e27927bb8d7167b3ef0d6b7d55d99f1f5eef50e76a65a"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:16:10 crc kubenswrapper[4713]: I0314 07:16:10.733155 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://20996e6149458437a19e27927bb8d7167b3ef0d6b7d55d99f1f5eef50e76a65a" gracePeriod=600 Mar 14 07:16:11 crc kubenswrapper[4713]: I0314 07:16:11.783313 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="20996e6149458437a19e27927bb8d7167b3ef0d6b7d55d99f1f5eef50e76a65a" exitCode=0 Mar 14 07:16:11 crc kubenswrapper[4713]: I0314 07:16:11.783402 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"20996e6149458437a19e27927bb8d7167b3ef0d6b7d55d99f1f5eef50e76a65a"} Mar 14 07:16:11 crc kubenswrapper[4713]: I0314 07:16:11.784440 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerStarted","Data":"1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9"} Mar 14 07:16:11 crc kubenswrapper[4713]: I0314 07:16:11.784483 4713 scope.go:117] "RemoveContainer" containerID="8fdb6622f9ef3fa39e8da06052d85c6220d8444ec4955f9f7113b017f100ebc0" Mar 14 07:16:59 crc kubenswrapper[4713]: I0314 07:16:59.641185 4713 scope.go:117] 
"RemoveContainer" containerID="6adf5afcf9e5a56dc70b7f035b2c30e2aa7e991c4e18eb8bfef832acfcc71d60" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.160401 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557878-g6tv2"] Mar 14 07:18:00 crc kubenswrapper[4713]: E0314 07:18:00.162283 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" containerName="copy" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.162306 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" containerName="copy" Mar 14 07:18:00 crc kubenswrapper[4713]: E0314 07:18:00.162340 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741e77d6-23b6-4f92-a568-15de57b9b700" containerName="oc" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.162347 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="741e77d6-23b6-4f92-a568-15de57b9b700" containerName="oc" Mar 14 07:18:00 crc kubenswrapper[4713]: E0314 07:18:00.162390 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" containerName="gather" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.162397 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" containerName="gather" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.162697 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" containerName="gather" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.162713 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="741e77d6-23b6-4f92-a568-15de57b9b700" containerName="oc" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.162734 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be982f6-96cd-4cf2-a3f2-9bf3f15e9f5b" containerName="copy" Mar 14 07:18:00 crc 
kubenswrapper[4713]: I0314 07:18:00.164448 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-g6tv2" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.168347 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.168514 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.168514 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.177322 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-g6tv2"] Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.212742 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s2nl\" (UniqueName: \"kubernetes.io/projected/a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba-kube-api-access-8s2nl\") pod \"auto-csr-approver-29557878-g6tv2\" (UID: \"a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba\") " pod="openshift-infra/auto-csr-approver-29557878-g6tv2" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.315052 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s2nl\" (UniqueName: \"kubernetes.io/projected/a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba-kube-api-access-8s2nl\") pod \"auto-csr-approver-29557878-g6tv2\" (UID: \"a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba\") " pod="openshift-infra/auto-csr-approver-29557878-g6tv2" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.339145 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s2nl\" (UniqueName: \"kubernetes.io/projected/a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba-kube-api-access-8s2nl\") pod 
\"auto-csr-approver-29557878-g6tv2\" (UID: \"a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba\") " pod="openshift-infra/auto-csr-approver-29557878-g6tv2" Mar 14 07:18:00 crc kubenswrapper[4713]: I0314 07:18:00.518472 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-g6tv2" Mar 14 07:18:01 crc kubenswrapper[4713]: I0314 07:18:01.069499 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-g6tv2"] Mar 14 07:18:01 crc kubenswrapper[4713]: I0314 07:18:01.231675 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557878-g6tv2" event={"ID":"a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba","Type":"ContainerStarted","Data":"4c7b4a939d5cbad123252dcc25cf18738b495054f7b3077a39af193558b2dec1"} Mar 14 07:18:03 crc kubenswrapper[4713]: I0314 07:18:03.304162 4713 generic.go:334] "Generic (PLEG): container finished" podID="a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba" containerID="7b6eec9f99b0ab3f0dc628d22d7770404dba77fd31559adc96b584b50ec01e20" exitCode=0 Mar 14 07:18:03 crc kubenswrapper[4713]: I0314 07:18:03.304265 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557878-g6tv2" event={"ID":"a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba","Type":"ContainerDied","Data":"7b6eec9f99b0ab3f0dc628d22d7770404dba77fd31559adc96b584b50ec01e20"} Mar 14 07:18:04 crc kubenswrapper[4713]: I0314 07:18:04.803027 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-g6tv2" Mar 14 07:18:04 crc kubenswrapper[4713]: I0314 07:18:04.876523 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s2nl\" (UniqueName: \"kubernetes.io/projected/a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba-kube-api-access-8s2nl\") pod \"a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba\" (UID: \"a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba\") " Mar 14 07:18:04 crc kubenswrapper[4713]: I0314 07:18:04.885270 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba-kube-api-access-8s2nl" (OuterVolumeSpecName: "kube-api-access-8s2nl") pod "a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba" (UID: "a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba"). InnerVolumeSpecName "kube-api-access-8s2nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:18:04 crc kubenswrapper[4713]: I0314 07:18:04.981139 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s2nl\" (UniqueName: \"kubernetes.io/projected/a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba-kube-api-access-8s2nl\") on node \"crc\" DevicePath \"\"" Mar 14 07:18:05 crc kubenswrapper[4713]: I0314 07:18:05.335420 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557878-g6tv2" event={"ID":"a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba","Type":"ContainerDied","Data":"4c7b4a939d5cbad123252dcc25cf18738b495054f7b3077a39af193558b2dec1"} Mar 14 07:18:05 crc kubenswrapper[4713]: I0314 07:18:05.335459 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-g6tv2" Mar 14 07:18:05 crc kubenswrapper[4713]: I0314 07:18:05.335489 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c7b4a939d5cbad123252dcc25cf18738b495054f7b3077a39af193558b2dec1" Mar 14 07:18:05 crc kubenswrapper[4713]: I0314 07:18:05.895410 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-f7jcd"] Mar 14 07:18:05 crc kubenswrapper[4713]: I0314 07:18:05.906874 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-f7jcd"] Mar 14 07:18:07 crc kubenswrapper[4713]: I0314 07:18:07.580974 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46cfb8e-338f-4281-84a1-59c3c42b7841" path="/var/lib/kubelet/pods/b46cfb8e-338f-4281-84a1-59c3c42b7841/volumes" Mar 14 07:18:40 crc kubenswrapper[4713]: I0314 07:18:40.731315 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:18:40 crc kubenswrapper[4713]: I0314 07:18:40.731908 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:18:59 crc kubenswrapper[4713]: I0314 07:18:59.827087 4713 scope.go:117] "RemoveContainer" containerID="630b70f850b6ba80c46b43d28c760bae5130e755d7f04c821de548c78ab9d62a" Mar 14 07:19:10 crc kubenswrapper[4713]: I0314 07:19:10.731862 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:19:10 crc kubenswrapper[4713]: I0314 07:19:10.732534 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:19:40 crc kubenswrapper[4713]: I0314 07:19:40.732042 4713 patch_prober.go:28] interesting pod/machine-config-daemon-ls8z5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:19:40 crc kubenswrapper[4713]: I0314 07:19:40.732699 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:19:40 crc kubenswrapper[4713]: I0314 07:19:40.732757 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" Mar 14 07:19:40 crc kubenswrapper[4713]: I0314 07:19:40.733817 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9"} pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:19:40 crc 
kubenswrapper[4713]: I0314 07:19:40.733882 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerName="machine-config-daemon" containerID="cri-o://1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" gracePeriod=600 Mar 14 07:19:40 crc kubenswrapper[4713]: E0314 07:19:40.897937 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:19:41 crc kubenswrapper[4713]: I0314 07:19:41.460071 4713 generic.go:334] "Generic (PLEG): container finished" podID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" exitCode=0 Mar 14 07:19:41 crc kubenswrapper[4713]: I0314 07:19:41.460124 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" event={"ID":"c6cc7fbb-a88a-4b94-89bb-1323e0751467","Type":"ContainerDied","Data":"1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9"} Mar 14 07:19:41 crc kubenswrapper[4713]: I0314 07:19:41.460162 4713 scope.go:117] "RemoveContainer" containerID="20996e6149458437a19e27927bb8d7167b3ef0d6b7d55d99f1f5eef50e76a65a" Mar 14 07:19:41 crc kubenswrapper[4713]: I0314 07:19:41.461046 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:19:41 crc kubenswrapper[4713]: E0314 07:19:41.461534 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.118699 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vdjjt"] Mar 14 07:19:44 crc kubenswrapper[4713]: E0314 07:19:44.121955 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba" containerName="oc" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.121977 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba" containerName="oc" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.122253 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5eaf0a7-7e07-4df7-9e41-c0c68d3a64ba" containerName="oc" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.124650 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.183362 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftlxd\" (UniqueName: \"kubernetes.io/projected/43379ecb-7759-4ee9-a761-048127c2f24f-kube-api-access-ftlxd\") pod \"redhat-operators-vdjjt\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.183472 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-catalog-content\") pod \"redhat-operators-vdjjt\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.183593 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-utilities\") pod \"redhat-operators-vdjjt\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.192986 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vdjjt"] Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.285496 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-utilities\") pod \"redhat-operators-vdjjt\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.285637 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ftlxd\" (UniqueName: \"kubernetes.io/projected/43379ecb-7759-4ee9-a761-048127c2f24f-kube-api-access-ftlxd\") pod \"redhat-operators-vdjjt\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.285706 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-catalog-content\") pod \"redhat-operators-vdjjt\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.286178 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-catalog-content\") pod \"redhat-operators-vdjjt\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.286525 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-utilities\") pod \"redhat-operators-vdjjt\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.307300 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftlxd\" (UniqueName: \"kubernetes.io/projected/43379ecb-7759-4ee9-a761-048127c2f24f-kube-api-access-ftlxd\") pod \"redhat-operators-vdjjt\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:44 crc kubenswrapper[4713]: I0314 07:19:44.498531 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:19:45 crc kubenswrapper[4713]: I0314 07:19:45.025138 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vdjjt"] Mar 14 07:19:45 crc kubenswrapper[4713]: W0314 07:19:45.025543 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43379ecb_7759_4ee9_a761_048127c2f24f.slice/crio-946f592d3ec32f922c104e6a4052394d6266a12bbfd9f6d08c7c3a108876a48e WatchSource:0}: Error finding container 946f592d3ec32f922c104e6a4052394d6266a12bbfd9f6d08c7c3a108876a48e: Status 404 returned error can't find the container with id 946f592d3ec32f922c104e6a4052394d6266a12bbfd9f6d08c7c3a108876a48e Mar 14 07:19:45 crc kubenswrapper[4713]: I0314 07:19:45.518070 4713 generic.go:334] "Generic (PLEG): container finished" podID="43379ecb-7759-4ee9-a761-048127c2f24f" containerID="63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38" exitCode=0 Mar 14 07:19:45 crc kubenswrapper[4713]: I0314 07:19:45.518129 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdjjt" event={"ID":"43379ecb-7759-4ee9-a761-048127c2f24f","Type":"ContainerDied","Data":"63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38"} Mar 14 07:19:45 crc kubenswrapper[4713]: I0314 07:19:45.519719 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdjjt" event={"ID":"43379ecb-7759-4ee9-a761-048127c2f24f","Type":"ContainerStarted","Data":"946f592d3ec32f922c104e6a4052394d6266a12bbfd9f6d08c7c3a108876a48e"} Mar 14 07:19:45 crc kubenswrapper[4713]: I0314 07:19:45.520443 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:19:46 crc kubenswrapper[4713]: I0314 07:19:46.561773 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vdjjt" event={"ID":"43379ecb-7759-4ee9-a761-048127c2f24f","Type":"ContainerStarted","Data":"f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84"} Mar 14 07:19:52 crc kubenswrapper[4713]: I0314 07:19:52.564375 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:19:52 crc kubenswrapper[4713]: E0314 07:19:52.565193 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:19:53 crc kubenswrapper[4713]: I0314 07:19:53.638113 4713 generic.go:334] "Generic (PLEG): container finished" podID="43379ecb-7759-4ee9-a761-048127c2f24f" containerID="f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84" exitCode=0 Mar 14 07:19:53 crc kubenswrapper[4713]: I0314 07:19:53.638196 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdjjt" event={"ID":"43379ecb-7759-4ee9-a761-048127c2f24f","Type":"ContainerDied","Data":"f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84"} Mar 14 07:19:54 crc kubenswrapper[4713]: I0314 07:19:54.651860 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdjjt" event={"ID":"43379ecb-7759-4ee9-a761-048127c2f24f","Type":"ContainerStarted","Data":"64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4"} Mar 14 07:19:54 crc kubenswrapper[4713]: I0314 07:19:54.695673 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vdjjt" podStartSLOduration=2.107198565 
podStartE2EDuration="10.695629572s" podCreationTimestamp="2026-03-14 07:19:44 +0000 UTC" firstStartedPulling="2026-03-14 07:19:45.520166698 +0000 UTC m=+6768.608075998" lastFinishedPulling="2026-03-14 07:19:54.108597705 +0000 UTC m=+6777.196507005" observedRunningTime="2026-03-14 07:19:54.667903553 +0000 UTC m=+6777.755812863" watchObservedRunningTime="2026-03-14 07:19:54.695629572 +0000 UTC m=+6777.783538862" Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.147351 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557880-rzxnf"] Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.149568 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-rzxnf" Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.154602 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.154744 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.154803 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.160006 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-rzxnf"] Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.227717 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrp4d\" (UniqueName: \"kubernetes.io/projected/075656c0-dac0-400c-97bf-33e343e30cd2-kube-api-access-vrp4d\") pod \"auto-csr-approver-29557880-rzxnf\" (UID: \"075656c0-dac0-400c-97bf-33e343e30cd2\") " pod="openshift-infra/auto-csr-approver-29557880-rzxnf" Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.330261 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrp4d\" (UniqueName: \"kubernetes.io/projected/075656c0-dac0-400c-97bf-33e343e30cd2-kube-api-access-vrp4d\") pod \"auto-csr-approver-29557880-rzxnf\" (UID: \"075656c0-dac0-400c-97bf-33e343e30cd2\") " pod="openshift-infra/auto-csr-approver-29557880-rzxnf" Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.359067 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrp4d\" (UniqueName: \"kubernetes.io/projected/075656c0-dac0-400c-97bf-33e343e30cd2-kube-api-access-vrp4d\") pod \"auto-csr-approver-29557880-rzxnf\" (UID: \"075656c0-dac0-400c-97bf-33e343e30cd2\") " pod="openshift-infra/auto-csr-approver-29557880-rzxnf" Mar 14 07:20:00 crc kubenswrapper[4713]: I0314 07:20:00.471031 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-rzxnf" Mar 14 07:20:01 crc kubenswrapper[4713]: I0314 07:20:01.023263 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-rzxnf"] Mar 14 07:20:01 crc kubenswrapper[4713]: W0314 07:20:01.025452 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod075656c0_dac0_400c_97bf_33e343e30cd2.slice/crio-3f891b2a21c4b41ddacef19645eedd0f4d9c3c58568422320f0b247401cf7953 WatchSource:0}: Error finding container 3f891b2a21c4b41ddacef19645eedd0f4d9c3c58568422320f0b247401cf7953: Status 404 returned error can't find the container with id 3f891b2a21c4b41ddacef19645eedd0f4d9c3c58568422320f0b247401cf7953 Mar 14 07:20:01 crc kubenswrapper[4713]: I0314 07:20:01.416528 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557880-rzxnf" event={"ID":"075656c0-dac0-400c-97bf-33e343e30cd2","Type":"ContainerStarted","Data":"3f891b2a21c4b41ddacef19645eedd0f4d9c3c58568422320f0b247401cf7953"} Mar 14 
07:20:03 crc kubenswrapper[4713]: I0314 07:20:03.485874 4713 generic.go:334] "Generic (PLEG): container finished" podID="075656c0-dac0-400c-97bf-33e343e30cd2" containerID="efd8db43d9d4c109430c5edac9d3849e888d95b7080a74bc9db2173c2cc4c538" exitCode=0 Mar 14 07:20:03 crc kubenswrapper[4713]: I0314 07:20:03.491677 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557880-rzxnf" event={"ID":"075656c0-dac0-400c-97bf-33e343e30cd2","Type":"ContainerDied","Data":"efd8db43d9d4c109430c5edac9d3849e888d95b7080a74bc9db2173c2cc4c538"} Mar 14 07:20:04 crc kubenswrapper[4713]: I0314 07:20:04.498821 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:20:04 crc kubenswrapper[4713]: I0314 07:20:04.501109 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:20:04 crc kubenswrapper[4713]: I0314 07:20:04.565061 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:20:04 crc kubenswrapper[4713]: E0314 07:20:04.565841 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:20:05 crc kubenswrapper[4713]: I0314 07:20:05.053320 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-rzxnf" Mar 14 07:20:05 crc kubenswrapper[4713]: I0314 07:20:05.177434 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrp4d\" (UniqueName: \"kubernetes.io/projected/075656c0-dac0-400c-97bf-33e343e30cd2-kube-api-access-vrp4d\") pod \"075656c0-dac0-400c-97bf-33e343e30cd2\" (UID: \"075656c0-dac0-400c-97bf-33e343e30cd2\") " Mar 14 07:20:05 crc kubenswrapper[4713]: I0314 07:20:05.184145 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/075656c0-dac0-400c-97bf-33e343e30cd2-kube-api-access-vrp4d" (OuterVolumeSpecName: "kube-api-access-vrp4d") pod "075656c0-dac0-400c-97bf-33e343e30cd2" (UID: "075656c0-dac0-400c-97bf-33e343e30cd2"). InnerVolumeSpecName "kube-api-access-vrp4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:05 crc kubenswrapper[4713]: I0314 07:20:05.280305 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrp4d\" (UniqueName: \"kubernetes.io/projected/075656c0-dac0-400c-97bf-33e343e30cd2-kube-api-access-vrp4d\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:05 crc kubenswrapper[4713]: I0314 07:20:05.516175 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-rzxnf" Mar 14 07:20:05 crc kubenswrapper[4713]: I0314 07:20:05.516324 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557880-rzxnf" event={"ID":"075656c0-dac0-400c-97bf-33e343e30cd2","Type":"ContainerDied","Data":"3f891b2a21c4b41ddacef19645eedd0f4d9c3c58568422320f0b247401cf7953"} Mar 14 07:20:05 crc kubenswrapper[4713]: I0314 07:20:05.516438 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f891b2a21c4b41ddacef19645eedd0f4d9c3c58568422320f0b247401cf7953" Mar 14 07:20:05 crc kubenswrapper[4713]: I0314 07:20:05.561324 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vdjjt" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="registry-server" probeResult="failure" output=< Mar 14 07:20:05 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:20:05 crc kubenswrapper[4713]: > Mar 14 07:20:06 crc kubenswrapper[4713]: I0314 07:20:06.135563 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-28kfl"] Mar 14 07:20:06 crc kubenswrapper[4713]: I0314 07:20:06.146327 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-28kfl"] Mar 14 07:20:07 crc kubenswrapper[4713]: I0314 07:20:07.594085 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b394d3-02cf-407e-a196-7da3e7f759f8" path="/var/lib/kubelet/pods/f7b394d3-02cf-407e-a196-7da3e7f759f8/volumes" Mar 14 07:20:15 crc kubenswrapper[4713]: I0314 07:20:15.553862 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vdjjt" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="registry-server" probeResult="failure" output=< Mar 14 07:20:15 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" 
within 1s Mar 14 07:20:15 crc kubenswrapper[4713]: > Mar 14 07:20:19 crc kubenswrapper[4713]: I0314 07:20:19.564518 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:20:19 crc kubenswrapper[4713]: E0314 07:20:19.565502 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:20:25 crc kubenswrapper[4713]: I0314 07:20:25.553740 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vdjjt" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="registry-server" probeResult="failure" output=< Mar 14 07:20:25 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:20:25 crc kubenswrapper[4713]: > Mar 14 07:20:33 crc kubenswrapper[4713]: I0314 07:20:33.564041 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:20:33 crc kubenswrapper[4713]: E0314 07:20:33.564923 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:20:35 crc kubenswrapper[4713]: I0314 07:20:35.552492 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vdjjt" 
podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="registry-server" probeResult="failure" output=< Mar 14 07:20:35 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s Mar 14 07:20:35 crc kubenswrapper[4713]: > Mar 14 07:20:44 crc kubenswrapper[4713]: I0314 07:20:44.575062 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:20:44 crc kubenswrapper[4713]: I0314 07:20:44.669258 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:20:45 crc kubenswrapper[4713]: I0314 07:20:45.346518 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vdjjt"] Mar 14 07:20:45 crc kubenswrapper[4713]: I0314 07:20:45.967121 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vdjjt" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="registry-server" containerID="cri-o://64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4" gracePeriod=2 Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.568724 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:20:46 crc kubenswrapper[4713]: E0314 07:20:46.576718 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.795147 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.982013 4713 generic.go:334] "Generic (PLEG): container finished" podID="43379ecb-7759-4ee9-a761-048127c2f24f" containerID="64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4" exitCode=0 Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.982088 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdjjt" event={"ID":"43379ecb-7759-4ee9-a761-048127c2f24f","Type":"ContainerDied","Data":"64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4"} Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.982113 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vdjjt" Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.982138 4713 scope.go:117] "RemoveContainer" containerID="64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4" Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.982125 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdjjt" event={"ID":"43379ecb-7759-4ee9-a761-048127c2f24f","Type":"ContainerDied","Data":"946f592d3ec32f922c104e6a4052394d6266a12bbfd9f6d08c7c3a108876a48e"} Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.987103 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-catalog-content\") pod \"43379ecb-7759-4ee9-a761-048127c2f24f\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.987236 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-utilities\") pod \"43379ecb-7759-4ee9-a761-048127c2f24f\" (UID: 
\"43379ecb-7759-4ee9-a761-048127c2f24f\") " Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.987555 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftlxd\" (UniqueName: \"kubernetes.io/projected/43379ecb-7759-4ee9-a761-048127c2f24f-kube-api-access-ftlxd\") pod \"43379ecb-7759-4ee9-a761-048127c2f24f\" (UID: \"43379ecb-7759-4ee9-a761-048127c2f24f\") " Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.987930 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-utilities" (OuterVolumeSpecName: "utilities") pod "43379ecb-7759-4ee9-a761-048127c2f24f" (UID: "43379ecb-7759-4ee9-a761-048127c2f24f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.988504 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:46 crc kubenswrapper[4713]: I0314 07:20:46.997781 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43379ecb-7759-4ee9-a761-048127c2f24f-kube-api-access-ftlxd" (OuterVolumeSpecName: "kube-api-access-ftlxd") pod "43379ecb-7759-4ee9-a761-048127c2f24f" (UID: "43379ecb-7759-4ee9-a761-048127c2f24f"). InnerVolumeSpecName "kube-api-access-ftlxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.012929 4713 scope.go:117] "RemoveContainer" containerID="f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.079942 4713 scope.go:117] "RemoveContainer" containerID="63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.090410 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftlxd\" (UniqueName: \"kubernetes.io/projected/43379ecb-7759-4ee9-a761-048127c2f24f-kube-api-access-ftlxd\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.138760 4713 scope.go:117] "RemoveContainer" containerID="64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4" Mar 14 07:20:47 crc kubenswrapper[4713]: E0314 07:20:47.140825 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4\": container with ID starting with 64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4 not found: ID does not exist" containerID="64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.140887 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4"} err="failed to get container status \"64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4\": rpc error: code = NotFound desc = could not find container \"64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4\": container with ID starting with 64f70d848e3da9f6c6679c9445ff80a0f57bd5d0df95b255fa6f3d90dee5d4a4 not found: ID does not exist" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.140923 
4713 scope.go:117] "RemoveContainer" containerID="f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84" Mar 14 07:20:47 crc kubenswrapper[4713]: E0314 07:20:47.141677 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84\": container with ID starting with f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84 not found: ID does not exist" containerID="f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.141722 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84"} err="failed to get container status \"f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84\": rpc error: code = NotFound desc = could not find container \"f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84\": container with ID starting with f8b76e4ed2b62f7618708a073c9b6c15513718497c0461e58c55bf2a69ae9a84 not found: ID does not exist" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.141753 4713 scope.go:117] "RemoveContainer" containerID="63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38" Mar 14 07:20:47 crc kubenswrapper[4713]: E0314 07:20:47.142015 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38\": container with ID starting with 63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38 not found: ID does not exist" containerID="63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.142048 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38"} err="failed to get container status \"63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38\": rpc error: code = NotFound desc = could not find container \"63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38\": container with ID starting with 63b9607dd4c22baf5db73d290ac5e6c969f59c7dd17e889fa3c8605657309e38 not found: ID does not exist" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.156531 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43379ecb-7759-4ee9-a761-048127c2f24f" (UID: "43379ecb-7759-4ee9-a761-048127c2f24f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.193899 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43379ecb-7759-4ee9-a761-048127c2f24f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.326604 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vdjjt"] Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.340956 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vdjjt"] Mar 14 07:20:47 crc kubenswrapper[4713]: I0314 07:20:47.581728 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" path="/var/lib/kubelet/pods/43379ecb-7759-4ee9-a761-048127c2f24f/volumes" Mar 14 07:20:59 crc kubenswrapper[4713]: I0314 07:20:59.569408 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:20:59 crc kubenswrapper[4713]: E0314 
07:20:59.570819 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:20:59 crc kubenswrapper[4713]: I0314 07:20:59.968823 4713 scope.go:117] "RemoveContainer" containerID="7b365ef42cca0bff73add41f7d0b5647864bcbaff506f91c1c3d53c8ea5ee9a9" Mar 14 07:21:12 crc kubenswrapper[4713]: I0314 07:21:12.563896 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:21:12 crc kubenswrapper[4713]: E0314 07:21:12.564729 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:21:26 crc kubenswrapper[4713]: I0314 07:21:26.563456 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:21:26 crc kubenswrapper[4713]: E0314 07:21:26.564642 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:21:37 crc 
kubenswrapper[4713]: I0314 07:21:37.573668 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:21:37 crc kubenswrapper[4713]: E0314 07:21:37.574577 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:21:50 crc kubenswrapper[4713]: I0314 07:21:50.565687 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:21:50 crc kubenswrapper[4713]: E0314 07:21:50.566691 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.170613 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557882-pf62h"] Mar 14 07:22:00 crc kubenswrapper[4713]: E0314 07:22:00.172084 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="extract-utilities" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.172101 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="extract-utilities" Mar 14 07:22:00 crc kubenswrapper[4713]: E0314 07:22:00.172112 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="registry-server" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.172119 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="registry-server" Mar 14 07:22:00 crc kubenswrapper[4713]: E0314 07:22:00.172133 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="extract-content" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.172140 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="extract-content" Mar 14 07:22:00 crc kubenswrapper[4713]: E0314 07:22:00.172167 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075656c0-dac0-400c-97bf-33e343e30cd2" containerName="oc" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.172174 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="075656c0-dac0-400c-97bf-33e343e30cd2" containerName="oc" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.172421 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="075656c0-dac0-400c-97bf-33e343e30cd2" containerName="oc" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.172439 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="43379ecb-7759-4ee9-a761-048127c2f24f" containerName="registry-server" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.173480 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-pf62h" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.176272 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-456gs" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.176536 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.177037 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.189248 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-pf62h"] Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.262666 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxdfn\" (UniqueName: \"kubernetes.io/projected/c077d0d2-4016-467d-b732-41192b5b27a7-kube-api-access-nxdfn\") pod \"auto-csr-approver-29557882-pf62h\" (UID: \"c077d0d2-4016-467d-b732-41192b5b27a7\") " pod="openshift-infra/auto-csr-approver-29557882-pf62h" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.367458 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxdfn\" (UniqueName: \"kubernetes.io/projected/c077d0d2-4016-467d-b732-41192b5b27a7-kube-api-access-nxdfn\") pod \"auto-csr-approver-29557882-pf62h\" (UID: \"c077d0d2-4016-467d-b732-41192b5b27a7\") " pod="openshift-infra/auto-csr-approver-29557882-pf62h" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.402531 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxdfn\" (UniqueName: \"kubernetes.io/projected/c077d0d2-4016-467d-b732-41192b5b27a7-kube-api-access-nxdfn\") pod \"auto-csr-approver-29557882-pf62h\" (UID: \"c077d0d2-4016-467d-b732-41192b5b27a7\") " 
pod="openshift-infra/auto-csr-approver-29557882-pf62h" Mar 14 07:22:00 crc kubenswrapper[4713]: I0314 07:22:00.505396 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-pf62h" Mar 14 07:22:01 crc kubenswrapper[4713]: I0314 07:22:01.026570 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-pf62h"] Mar 14 07:22:01 crc kubenswrapper[4713]: I0314 07:22:01.951220 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-pf62h" event={"ID":"c077d0d2-4016-467d-b732-41192b5b27a7","Type":"ContainerStarted","Data":"0b74be46e3cd79d91145073c175bb60196e2443f5f06f24a72ff9ecd43f51f10"} Mar 14 07:22:02 crc kubenswrapper[4713]: I0314 07:22:02.566077 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:22:02 crc kubenswrapper[4713]: E0314 07:22:02.566930 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:22:04 crc kubenswrapper[4713]: I0314 07:22:04.987716 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-pf62h" event={"ID":"c077d0d2-4016-467d-b732-41192b5b27a7","Type":"ContainerStarted","Data":"d80398bc6a1013010264b2c7cc1e3c5ac351a79ed3ae74457029ce22ed5932c0"} Mar 14 07:22:05 crc kubenswrapper[4713]: I0314 07:22:05.017876 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557882-pf62h" podStartSLOduration=2.445105201 
podStartE2EDuration="5.017852892s" podCreationTimestamp="2026-03-14 07:22:00 +0000 UTC" firstStartedPulling="2026-03-14 07:22:01.031547929 +0000 UTC m=+6904.119457229" lastFinishedPulling="2026-03-14 07:22:03.60429562 +0000 UTC m=+6906.692204920" observedRunningTime="2026-03-14 07:22:05.009099737 +0000 UTC m=+6908.097009037" watchObservedRunningTime="2026-03-14 07:22:05.017852892 +0000 UTC m=+6908.105762192" Mar 14 07:22:08 crc kubenswrapper[4713]: I0314 07:22:08.045347 4713 generic.go:334] "Generic (PLEG): container finished" podID="c077d0d2-4016-467d-b732-41192b5b27a7" containerID="d80398bc6a1013010264b2c7cc1e3c5ac351a79ed3ae74457029ce22ed5932c0" exitCode=0 Mar 14 07:22:08 crc kubenswrapper[4713]: I0314 07:22:08.047430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-pf62h" event={"ID":"c077d0d2-4016-467d-b732-41192b5b27a7","Type":"ContainerDied","Data":"d80398bc6a1013010264b2c7cc1e3c5ac351a79ed3ae74457029ce22ed5932c0"} Mar 14 07:22:09 crc kubenswrapper[4713]: I0314 07:22:09.540163 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-pf62h" Mar 14 07:22:09 crc kubenswrapper[4713]: I0314 07:22:09.674990 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxdfn\" (UniqueName: \"kubernetes.io/projected/c077d0d2-4016-467d-b732-41192b5b27a7-kube-api-access-nxdfn\") pod \"c077d0d2-4016-467d-b732-41192b5b27a7\" (UID: \"c077d0d2-4016-467d-b732-41192b5b27a7\") " Mar 14 07:22:09 crc kubenswrapper[4713]: I0314 07:22:09.682793 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c077d0d2-4016-467d-b732-41192b5b27a7-kube-api-access-nxdfn" (OuterVolumeSpecName: "kube-api-access-nxdfn") pod "c077d0d2-4016-467d-b732-41192b5b27a7" (UID: "c077d0d2-4016-467d-b732-41192b5b27a7"). InnerVolumeSpecName "kube-api-access-nxdfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:09 crc kubenswrapper[4713]: I0314 07:22:09.780245 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxdfn\" (UniqueName: \"kubernetes.io/projected/c077d0d2-4016-467d-b732-41192b5b27a7-kube-api-access-nxdfn\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[4713]: I0314 07:22:10.071790 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-pf62h" event={"ID":"c077d0d2-4016-467d-b732-41192b5b27a7","Type":"ContainerDied","Data":"0b74be46e3cd79d91145073c175bb60196e2443f5f06f24a72ff9ecd43f51f10"} Mar 14 07:22:10 crc kubenswrapper[4713]: I0314 07:22:10.071855 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b74be46e3cd79d91145073c175bb60196e2443f5f06f24a72ff9ecd43f51f10" Mar 14 07:22:10 crc kubenswrapper[4713]: I0314 07:22:10.071985 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-pf62h" Mar 14 07:22:10 crc kubenswrapper[4713]: I0314 07:22:10.145190 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-hs767"] Mar 14 07:22:10 crc kubenswrapper[4713]: I0314 07:22:10.156059 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-hs767"] Mar 14 07:22:11 crc kubenswrapper[4713]: I0314 07:22:11.577815 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="741e77d6-23b6-4f92-a568-15de57b9b700" path="/var/lib/kubelet/pods/741e77d6-23b6-4f92-a568-15de57b9b700/volumes" Mar 14 07:22:16 crc kubenswrapper[4713]: I0314 07:22:16.567104 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:22:16 crc kubenswrapper[4713]: E0314 07:22:16.569335 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467" Mar 14 07:22:28 crc kubenswrapper[4713]: I0314 07:22:28.568080 4713 scope.go:117] "RemoveContainer" containerID="1fdce240bf334565bb1333cbe7a7eca8530769c56bfbbdd29fb6feb530ca77d9" Mar 14 07:22:28 crc kubenswrapper[4713]: E0314 07:22:28.569531 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ls8z5_openshift-machine-config-operator(c6cc7fbb-a88a-4b94-89bb-1323e0751467)\"" pod="openshift-machine-config-operator/machine-config-daemon-ls8z5" podUID="c6cc7fbb-a88a-4b94-89bb-1323e0751467"